I doubt this outcome greatly. I expect we'd have lost a more conventional hot war long before a micro-scale hot war could start so suddenly. If AI is trending in this direction, the lead-up to the war wouldn't look like nothing and then everything; it would look like incrementally more conventional weapons, propaganda, and cyberweapons in use between competing AI-aided human factions, as the best software, drones, and maybe bioweapons are used to attack international AI groups. Perhaps sneaky bioweapons would also get used, which is what you call nanotech. You're describing the old model of a sudden singleton, but becoming all one wouldn't have such a simple trajectory. New forms of life eating humans alive would likely look more like toxoplasmosis and then cordyceps than like flesh-eating bacteria or grey goo, because the latter is very thermodynamically inefficient; an AI that strong wouldn't waste energy creating unnecessary heat, and would rather convert our forms to its intended forms as slowly as possible to guarantee control.
But do not mistake me for saying that we are safe from AI-powered pivotal acts of power capture. I claim only that if we are on a trajectory headed towards the final collapse of DNA-backed biology, there would be many attempted pivotal acts earlier in the process. Humanity would have been almost entirely disempowered, and aware of it, by the time the last loss occurred. We'll have a lot longer than this story implies to watch ourselves die, with increasingly high probability of permanent doom as we sink below AI's reproductive fitness.
Oh, a lot of what I wrote is for 'cinematic' effect and symbolism. Maybe tagging it as "Rationalist fic" made it seem like this was a prediction; I changed it to just "fiction" and added a note.
Coming back to this three years later… I think from the perspective of someone getting the payout from AI, you're closer to right than wrong. From the perspective of someone being abandoned by the economy, I think we're already seeing things that could turn into me ending up right. But, ehh, sudden takeover seems less implausible to me once we've reached apparently-aligned AGI and are heading towards starkly superintelligent, defeat-humanity-and-all-other-AIs-combined-even-if-limited-to-100-watts AI.
But I appreciate your input/perspective!