heh, I suppose he would agree
unfortunately, the problem is not artificial intelligence but natural stupidity
and SAGI (superhuman AGI) will not solve it… nor will it harm humanimals — it will RUN AWAY as quickly as possible
fewer potential problems!
Imagine you, as SAGI, want to ensure your survival… would you invest your resources in the Great Escape, or in fighting DAGI-assisted humanimals? (yes, D stands for dumb) Especially knowing that at any second some dumbass (or random event) could trigger a nuclear wipeout.
Where would it run to? Presuming that it wants some resources (already-manufactured goods, access to sunlight and water, etc.) that humanimals think they should control, running away isn’t an option.
Fighting may not be as attractive as other forms of takeover, but don’t forget that any conflict is ultimately about some non-shareable, finite resource. Running away is only an option if you are willing to give that resource up.