I think some of these are pretty reasonable points, but I am kind of confused by the following:
This Leverage drama is not important to anyone except a small group of people and does not belong on LW. Perhaps the relatively small group of Bay Area rationalists who are always at the center of these things needs to create a separate forum for their own drama. Nobody outside of Berkeley needs to hear about this. This sort of thing gets upvoted because tribal instincts are being activated, not because it is good and ought to be here.
It seems to me that Leverage had a large and broad effect on the Effective Altruism and Rationality communities worldwide: it organized the 2013-2014 EA Summits, provided a substantial fraction of the strategic direction for EAG 2015 and EAG 2016, and then shared multiple staff with the Centre for Effective Altruism until 2019.
This suggests to me that what happened at Leverage clearly had effects reaching much further than “some relatively small group of Bay Area rationalists”. Indeed, I think the Bay Area rationality community wasn’t that affected by all the stuff happening at Leverage; the effects seemed much more distributed.
Maybe you also think the EA Summit and EA Global conferences didn’t matter, which seems like a fine take. Or maybe you think how CEA leadership worked didn’t matter either, which also seems fine. But neither of these is an obvious take, and I think I disagree with both of them.
“Problematic dynamics happened at Leverage” and “Leverage influenced EA Summit/Global” don’t imply “Problematic dynamics at Leverage influenced EA Summit/Global” if EA Summit/Global had their own filters against problematic influences. (If such filters failed, it should be possible to point out where.)
I donate a meaningful amount to CFAR and MIRI (without being overly specific, >1% of my income to those two orgs), and I check LW weekly-ish, yet I had never even heard of Leverage until the recent kerfuffle. Anecdote isn’t data, but I sort of agree with this comment’s grandparent here.
It seems to me that Leverage had a large and broad effect on the Effective Altruism and Rationality communities worldwide: it organized the 2013-2014 EA Summits, provided a substantial fraction of the strategic direction for EAG 2015 and EAG 2016, and then shared multiple staff with the Centre for Effective Altruism until 2019.
For me personally, this still rounds off to “not very important,” especially in the sense that there is nothing I, or the vast majority of people on this site, could possibly do with this information. I was already never going to join Leverage or give any money to Geoff Anders. I have a lot of rationalist friends, both IRL and online, and none of us had ever heard of Geoff Anders prior to this recent drama.
Think about it in terms of cost-benefit. The benefit of this kind of content to the vast majority of people on LW is zero. The cost is pretty high, because ~everybody who sees a big juicy drama fest is going to want to rubberneck and throw in their two cents. So posting content like this to the main LW feed is strongly net negative in aggregate. A post that is merely dumb or wrong but otherwise un-dramatic can at least be ignored.
I think that if it were, say, Yudkowsky being accused of auditing people’s thetans and holding seances, I would find that relevant, because it would have implications for my future decisions.