6. Celebrity helps launder taboo ideology. If you believe Muslim immigration is threatening, you might not be willing to say that aloud – especially if you’re an ordinary person who often trips on their tongue, and the precise words you use are the difference between “mainstream conservative belief” and “evil bigot who must be fired immediately”. Saying “I am really into Sam Harris” both leaves a lot of ambiguity, and lets you outsource the not-saying-the-wrong-word-and-getting-fired work to a professional who’s good at it. In contrast, if your belief is orthodox and you expect it to win you social approval, you want to be as direct as possible.
I don’t think that admitting to believing in AI X-risk would get you labelled as evil, but it could possibly get you labelled as crazy (which is maybe worse than evil for intellectuals?). To be able to publicly admit to that view, you either have to be able to argue for it really well, or be able to outsource that arguing to someone else.
I don’t know why this is happening with this book in particular though, since there are already lots of books, blog posts, YouTube videos, etc, that explain AI X-risk. Maybe none of the previous works were good enough? Or respectable enough? Or advertised enough?
I think what is going on is something like what Scott describes in Can Things Be Both Popular And Silenced?