Love your ring analogy; I find it quite pertinent (and I catch myself, half-consciously, thinking just a bit as you suggest, indeed).
But re
[..] gave the terrible speech [..] He effectively one-shot everyone of a particular nerdish variety into becoming obsessed with the power of the Ring/AI,
to express it using your analogy itself: I’m pretty sure that, one way or another, the Ring would by now have found its way to sway us even in the absence of Eliezer.
I can’t take credit for it; I think I first saw it in this thread, where someone points out that Yudkowsky is one of the few to have passed the ring-temptation test. Then I thought, “how would this metaphor actually play out...” and ended up at my shortform post.
I agree that people would find the idea of AGI alluring and seductive even without Yudkowsky. As I said, I admire him very much for stating his beliefs honestly, with conviction and effectiveness. I find it sad that, despite this, the ring-temptation is such that even earnest warnings can be flipped into “well, I could be the special one” narratives. But as I said, I’m also moving away from the “worst-possible-godly-alien-superintelligence” framing of AGI as a whole.