Would it have been better to use a title that fewer people would feel the need to disclaim?
I think Eliezer and Nate are basically correct to believe that, if someone built “It,” the overwhelmingly likely outcome would be everyone dying.
Still, maybe they should have written a book with a title that more people around these parts wouldn’t feel the need to disclaim, and that the entire x-risk community could have enthusiastically gotten behind. I think they should have at least considered that. Something more like “If anyone builds it, everyone loses.” (that title doesn’t quite work, but, you know, something like that)
My own answer is “maybe”—I see the upside. I want to note some of the downsides or counter-considerations.
(Note: I’m specifically considering this from within the epistemic state of “you pretty confidently believe that everyone would literally die, and that if they didn’t literally die, whatever happened instead would be catastrophically bad for most people’s values and astronomically bad by Eliezer’s and Nate’s values.”)
Counter-considerations include:
AFAICT, Eliezer and Nate spent roughly 8 years deliberately backing off and toning things down, out of a vague deferral to people saying “guys, you suck at PR and at being the public faces of this movement.” The result of this was (from their perspective) “EA gets co-opted by OpenAI, which launches a race that dramatically increases the danger the world faces.”
So, the background context here is that they have tried more epistemic-prisoner’s-dilemma-cooperative-ish strategies, and they haven’t worked well.
Also, it seems like there’s a large industrial complex of people arguing for various flavors of “things are pretty safe”, and there’s barely anyone at all stating plainly “IABED”. MIRI’s overall strategy right now is to speak plainly about what they believe, both because they think it needs to be said and no one else is saying it, and because they hope just straightforwardly saying what they believe will net a reputation for candor that you don’t get if people get a whiff of you trying to modulate your beliefs based on public perception.
None of that is an argument that they should exaggerate or lean-extra-into beliefs that they don’t endorse. But, given that they are confident about it, it’s an argument not to go out of their way to try to say something else.