Apologies, I did misread your original causality claim.
FPVs are less “air force” and more “precision munitions”. You can think of them as a new “crewed ATGM” variant, command guidance and all.
They work great for precision ground-to-ground strikes, but play little role in what is meant by “air supremacy”. They can’t pose a meaningful threat to most air platforms, and most air platforms can’t effectively hit them. They do nothing to deny the US the ability to perform CAS or otherwise hit targets from the air.
The main exception to that is helicopters, for the same reasons ATGMs can pose a threat to helicopters in some circumstances. Specialized FPV interceptors, in the hands of skilled operators, can also hit other drones, including heavier fixed-wing drones like the Shahed or even the Reaper—allowing them to intrude on MANPADS territory. But the traditional “JDAM trucks” aren’t in the same bracket as FPV drones.
We also have very little information on FPV crew survivability in an environment where one of the parties has advanced ISR (ELINT included), fast kill loops, and enough air control to drop JDAMs freely. There is every reason to expect more attrition on FPV crews, and skilled operators aren’t easy to replace—but quantitatively, we don’t know by how much. It might be enough to make “deny the enemy most FPV ops within an area” a viable prospect, but you can’t count on it.
The power of agentic coding is that the same agent can write, build, run and test the code—and then tweak it based on what it sees. The closed loop is what actually makes this work. Open loop sucks.
Note that this is not AI-exclusive. Human programmers also suck at producing working code without access to a compiler or an ability to test the code.
If you can’t run a closed loop for some reason, then do the cheap stupid version of it: run the thing yourself and paste the errors you get into the AI’s chat window.
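The cheap version can even be scripted: capture everything the failing command prints and dump it in one pasteable block. The command below (importing a module that doesn’t exist) is just a stand-in for whatever build or test command you actually run:

```python
import subprocess
import sys

# Stand-in failing command; swap in your real build/test invocation.
proc = subprocess.run(
    [sys.executable, "-c", "import nonexistent_module"],
    capture_output=True, text=True,
)

# One block you can paste straight into the chat window.
report = (
    f"command exited with code {proc.returncode}\n"
    f"--- stderr ---\n{proc.stderr}"
)
print(report)
```

Crude, but it closes the loop with you as the transport layer.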
The dirty little secret is that “quality assurance” on code borders on non-existent in a solid 60% of cases—enterprise or no enterprise, AI or no AI. Features ship broken and failures surface in prod. The same mitigations that apply to human-induced faults apply to AI-induced faults—assuming anyone gives enough of a fuck to have any.