I think there are good reasons to expect that a large fraction of humans might die even if humans immediately surrender:
Accepting the surrender might be an unstable position for the AI, given that it has limited channels of influence on the physical world. (Whereas with far fewer humans around, this changes.)
The AI might not care much either way, or might be myopic, or might have arbitrary other motivations, etc.