I think it’s pretty bizarre that despite the fact that LessWrongers are usually acutely aware of the epistemic downsides of being an activist, they seem to have paid relatively little attention to this in their recent transition to activism.
FWIW I’m the primary organizer of PauseAI UK and I’ve thought about this a lot. I agree with Buck’s statement.
However, I also feel that for the last 10 years the reverse facet of that point has been argued ad nauseam, both among people on LW and as criticism coming from outside the community: “People on LessWrong care about epistemic purity, and they will therefore never get anything done in the real world. It’s easy to have pure epistemics if you’re just sitting with your friends thinking about philosophy. If LW really cared about saving the world, they would stop with ‘politics is the mind-killer’ and ‘scout mindset’ and start actually trying to win.”
And I think that criticism has some validity: “The right amount of politics is not zero, even though it really is the mind-killer.”
But I also think the arguments for taking AI x-risk very seriously are unusually strong compared with most political debates. It’s an argument we should be able to win while speaking only the whole truth. And in some sense, it can’t be allowed to become an “ordinary” political issue: if it does, the action will not be swift and decisive enough. And if people start making a lot of misleading statements, even if not outright false ones, the risk of that becomes very high.