A single generation difference in military technology is an overwhelming advantage. The Lockheed Martin F-35 Lightning II cannot be missile-locked by an adversary beyond roughly 20-30 miles. Conversely, it can detect and weapon-lock an opposing 4th-generation fighter from more than 70 miles and fire a beyond-visual-range missile that is almost impossible for a manned fighter to evade.
In realistic scenarios with adequate preparation and competent deployment, a generation difference in aircraft can lead to 20:1 kill/death ratios. 5th-generation fighters are much better than 4th-generation fighters, which are much better than 3rd-generation fighters, and so on. The same holds for tanks, ships, artillery, etc. This difference is primarily technological.
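The asymmetry above can be made concrete with a toy calculation. The lock ranges come from the text; the head-on closure speed is an illustrative assumption I've added, not a figure from the original.

```python
# Toy sketch of the one-sided beyond-visual-range engagement window.
# Lock ranges are from the text; closure speed is an assumed illustrative value.

lock_range_5th_gen_mi = 70   # 5th-gen jet can lock a 4th-gen jet from here (text)
lock_range_4th_gen_mi = 25   # midpoint of the 20-30 mile figure (text)
closure_speed_mph = 1200     # assumed head-on closure, roughly 600 mph each

# Band of distances where only the 5th-gen fighter can shoot:
one_sided_band_mi = lock_range_5th_gen_mi - lock_range_4th_gen_mi  # 45 miles

# Time spent in that band before the 4th-gen jet can shoot back:
window_min = one_sided_band_mi / closure_speed_mph * 60

print(f"One-sided engagement window: {window_min:.2f} minutes")
```

Even under these rough assumptions, the 5th-generation fighter gets on the order of two minutes of unanswered missile shots before its opponent can even attempt a lock, which is more than enough to decide the engagement.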
It is not at all unlikely that a machine superintelligence could rapidly design new materials, artificial organisms, and military technologies vastly better than any constructed by humans today. These could fairly be called superweapons.
The idea that AI-designed nanomachines will outcompete bacteria and consume the world in a grey goo swarm may seem fanciful, but that is not evidence that it isn't in the cards. There are reasonably good technical arguments that bacteria are already at various thermodynamic limits. As bhauth notes, Yudkowsky seems to underrate the ability of evolution by natural selection to find highly optimized structures.
However, I don’t see this as enough evidence to rule out grey goo scenarios. Sitting at a Pareto optimum doesn’t mean you can’t be outcompeted. Evolution is much more efficient than it is sometimes given credit for, but it still seems to miss obvious improvements.
Of course, nanotech is likely a superweapon even without grey goo scenarios, so this is only a possible extreme. And finally, of course, a machine superintelligence (or several) possesses many advantages over biological humans, any of which may prove more relevant for a short-term takeover scenario.