I feel like there was a mass community movement (not unanimous but substantial) from AGI-scenarios-that-Eliezer-has-in-mind to AGI-scenarios-that-Paul-has-in-mind, e.g. more belief in slow takeoff + multipolar + “What Failure Looks Like” and less belief in fast takeoff + decisive strategic advantage + recursive self-improvement + powerful agents coherently pursuing misaligned goals. This was mostly before my time, I could be misreading things, that’s just my impression. :-)
Seems true. Notably, if I have my cynical hat on (and I think I probably do?), that shift depended on having Paul say a bunch of things about it, and Paul had previously also established himself as a local “thinker celebrity”.
If I have my somewhat less cynical hat on, I do honestly think our status gradients do a decent job of tracking “person who is actually good at figuring things out”, such that “local thinker celebrity endorses a thing” is not just crazy; it’s a somewhat reasonable filtering mechanism. But I do think the effect is real.