But if your best-case scenario is “Maybe we’ll wind up as beloved house pets!”, maybe you should think carefully before building AGI.
Also because, as I have already argued elsewhere, other people are not completely stupid. If they realise you are about to unleash a thing that runs very much against most humans’ traditional values, a thing many would consider so horrible and evil that death would be preferable to it, there is a non-zero chance that you, your data centre, and your entire general neighbourhood get evaporated by a sufficient application of thermonuclear plasma before you can push that button. Developing an AGI that merely disempowers everyone and enforces its own culture on the world forever is essentially no different from launching a weapon of mass destruction.