I’ve thought about what would happen if various AI players got their hands on godlike powers via ASI. My subjective impressions, from least to most troubling:
Demis Hassabis—Best possible outcome. Utopia for all of us. Something like The Metamorphosis of Prime Intellect. Way better than a random unaligned superintelligence, and way better than an aligned ASI controlled by a committee of world governments.
Elon Musk—Concerning but okay. We’ll probably get utopia, and it’s probably still better than a random unaligned ASI or one controlled by governments, but Musk will have some greatly elevated position and you won’t want to piss him off. His enemies from the pre-AGI days will be in danger. I would enjoy his world but do my best to avoid ever attracting his attention.
Dario Amodei—I don’t want this. We’ll probably get a situation vastly better than the present world, but that is a low bar and I would rather take my chances with an ASI controlled by a broad committee, or maybe even a random unaligned one. Amodei seems to have strong principles, but ones that are fairly alien to my own, and which take a great interest in me and how I live. That’s a troubling combination, especially with immortality on the cards.
Sam Altman—We’re dead within days. A coldly rational actor with ASI kills everyone else as quickly as possible, knowing that they can always be recreated once said actor has put safeguards in place to ensure no one else can build ASI. (This is the same first-strike incentive that reliably emerges for governments with nuclear weapons. Individuals have exactly the same incentives as governments, but we usually don’t see them act on those incentives because individuals are far less powerful and so face totally different circumstances.) I have zero doubt that Sam understands this simple game theory, and I have never seen evidence from him of any deeper principles or desires that would lead him to act against his incentives in this case.
All that being said, I think the idea of one man controlling humanity’s future looks pretty implausible at present, and if it does happen, it’s probably coming late enough that the man in question hasn’t been born yet. This stuff feels pretty firmly in the realm of sci-fi speculation.