I think if anyone builds overwhelming superintelligence without hitting a pretty narrow alignment target, everyone probably dies.
I fear that even in most of the narrow cases where the superintelligence is controlled, we’re probably still pretty thoroughly screwed. Because then you need to ask, “Precisely who controls it?” Given a choice between Anthropic totally losing control of a future Claude, and Sam Altman having tight personal control over GPT Omega (“The last GPT you’ll ever build, humans”), which scenario is actually scarier? (If you have a lot of personal trust in Sam Altman, substitute your least favorite AI lab CEO, or a small committee of powerful politicians from a party you dislike.)
Also because sharing the planet with a slightly smarter species still doesn’t seem to bode well (see: humans vs. Neanderthals, or humans vs. chimpanzees).
Yeah, unless you believe in ridiculously strong forms of alignment, and unprecedentedly good political systems to control the AIs, the whole situation seems horribly unstable. I’m slightly more optimistic about early AGI alignment than Yudkowsky, but I actually might be more pessimistic about the long term.