[note: I’m not particularly EA, beyond the motte of caring about others and wanting my activities to be effective. ]
I think this is basically correct. EA tends to attract outliers who are susceptible to claims of aggrandizement—telling themselves, and being told, that they're the heroes of the story. It reinforces this with contrarianism, especially on dimensions backed by clever, math-sounding, legible arguments. And then it reinforces the idea that "effective" is really about the biggest numbers you can plausibly multiply your wild guesses out to.
Until recently, it was all circulating in a pile of free money driven by the related insanity of crypto and tech investment, which seemed to have completely forgotten that zero interest rates were unlikely to continue forever, and that actually producing stuff would eventually be important.
[ epistemic status for next section: provocative devil’s advocate argument ]
The interesting question is "sure, it's crazy, but is it wrong?" I suspect it is wrong—the multiplicative factors projected into the future are extremely tenuous. But in the event that this level of commitment and intensity DOES cause alignment to be solved in time, it's arguable that all the insanity is worth it. If your advice makes the efforts less individually harmful, but also a little bit less effective, it could be a net harm to the universe.