Do you think if an AI with random goals that doesn’t get acausally paid to preserve us takes over, then there’s a meaningful chance there will be some humans around in 100 years? What does it look like?
“Random goals” is a crux. Complicated goals that we can’t control well enough to prevent takeover are not necessarily uniformly random goals from whatever space you have in mind.