When I read this I wasn’t 100% sure I grokked the magnanimous point.
Then an anecdote happened to me. :) :/
I recently (due to AlphaGo Zero) gained the system-1 alief that “there is actually a 10%+ chance that we’re all dead in 15 years.” For the past 6 years I’ve basically system-2 believed this, but there was sufficient wiggle room given Raemon’s-crude-understanding-of-AI-knowledge that it was *much* more like “okay, I can see how it could happen, and smart monkeys I trust seem to think it’s likely”, but ultimately the argument was built off of metaphor and guesses about reference classes that I wasn’t forced to take seriously.
Post-AlphaGo-Zero, I have a concrete sense—I see the gears locking into place and how they fit together. They are still sticking out of some black boxes, but I see the indisputable evidence that it is more likely for the black box to be Fast Takeoff shaped than Slow Takeoff shaped.
The first thing I felt was an abstract “woah.”
The second thing I felt was a general, shaking scaredness.
The third thing was “I want to talk to my parents about this.” I want a sit-down conversation that goes something like “okay, Mom, Dad, I know I’ve been spouting increasingly weird stuff over the past 6 years, and for the most part we’ve sort of agreed to not stress out too much about it nowadays. But, like, I actually believe there’s a 10% chance that we all die in 15 years, and I believe it for reasons that I think are in principle possible for me to explain to you.
And… it’s okay if you still don’t actually share this belief, and it’s definitely okay if, even if you do believe it, you mostly shrug and go about your business, because what can you do? But it’s really important to me that I at least try to talk about this, and that you at least try to listen.”
This doesn’t feel quite shaped like the Magnanimous Error as you describe it, but I’m curious if it feels like it’s pointing at a similar phenomenon.