The idea is that there are lots of resources a superhuman AGI would be in competition with humans for if it didn’t share our ideas about how those resources should be used. The biggest one is probably energy (more precisely, thermodynamic free energy). Energy is extremely useful; it’s a requirement for doing just about anything in this universe. So an AGI going about its business without any regard for humans would be doing things like setting up a Dyson sphere around the sun, or maybe building large fusion reactors all over the Earth’s surface. The Dyson sphere would deprive us of the sunlight we need to live, while the fusion reactors might produce enough waste heat to make the planet’s surface uninhabitable for humans. An AGI going about its business with no regard for humans has no reason to make sure we survive.
That’s the base case, where humanity doesn’t fight back. If (some) humans figure out that that’s how it’s going to play out, then they’re likely to try to stop the AGI. If the AGI predicts that we’re going to fight back, then maybe it can just ignore us like tiny ants, or maybe it’s simpler and safer for it to deliberately kill everyone at once, so we don’t do complicated and hard-to-predict things to mess up its plans.
TLDR: Even if we don’t fight back, an AGI pursuing its own goals does things that kill us as a side effect. So we’ll probably fight back once we notice an AGI doing that. A potential strategy for the AGI to deal with this is to kill everyone before we have a chance to notice.