I’m a high-karma LW member and I created an anonymous account to say this, for reasons given below. Trust me on that or don’t; my arguments should stand on their own.
Way too much of this kind of self-obsessed community gossip has dominated LW in recent weeks. This stuff demands highly disproportionate attention and has made LW a net-negative place for me to spend time.
This Leverage drama is not important to anyone except a small group of people and does not belong on LW. Perhaps the relatively small group of Bay Area rationalists who are always at the center of these things need to create a separate forum for their own drama. Nobody outside of Berkeley needs to hear about this. This sort of thing gets upvoted because tribal instincts are being activated, not because this is good and ought to be here.
I have a much lower opinion of literally everybody even tangentially involved in this whole thing, even Anna Salamon, who made the extremely bad PR choice of getting herself and her organization sucked into a completely avoidable vortex of bad publicity. At this point I am not sure that CFAR has created any value at all in recent years; all I know is that there are some vague and impossible-to-pin-down connections to some extremely terrible-sounding people and situations. This is intended mostly as a statement, from an uninvolved bystander, about how bad the optics are here and how much they have negatively impacted my own subjective impression of CFAR and the Bay Area rationality community at large.
If you disagree with the above and really, really feel like you need to write a top-level post about some kind of community drama, then please at least try to do a good job of it. Separate from the frequency of these posts is the issue of their quality. Duncan Sabien’s multiple recent posts and this incredibly long and time-consuming transcript are extremely low-effort and low-quality: the former are badly written and the latter is just a transcript. If you felt you needed to post this, you could have at least provided a short summary of the major points so people could decide whether they needed to read it.
You might say, “Nobody is making you read it.” That’s true, but it misses the fact that gossip activates tribal reflexes that are very hard to fight, and anything with 100+ upvotes demands attention; I can’t tell a priori whether those upvotes reflect quality and importance or mere tribal solidarity. I created an anonymous account because I want to say this once and then be allowed to stop thinking about it, without getting roped into the whole tribal-signaling dynamic, which I resent. I know there are other people like me, because I have had this conversation with several other rationalists in person and we are uniformly annoyed yet “nerd-sniped” by this situation, yet none of us wants to say anything because we don’t want to get involved at all. Again, trust me or don’t.
This community is much bigger and more important than the 15 or so high-drama, high-disagreeability people who live in the Bay Area, and at this point I feel like those people need to spend less time posting about their social group and more time posting about rationality and stuff.
I don’t agree with the characterization of this topic as self-obsessed community gossip. For context, I’m quite new and don’t have a dog in the fight. But I drew memorable conclusions from this that I couldn’t have gotten from more traditional posts:
First, experimenting with our own psychology is tempting and really dangerous. Next time, I’d turn the caution dial up way higher than Leverage did.
Second, a lot of us (probably including me) have an exploitable weakness brought on by high scrupulosity combined with openness to crazy-sounding ideas. Next time, I’d be more cautious (but not too cautious!) about proposals like joining Leverage.
Third, if we ever need to maintain the public’s goodwill, I’ll try not to use words like “demonic seance”… even if I don’t mean it literally.
In short, this is the sort of mistake worth learning about, including for those not personally affected, because it’s the kind of mistake we could plausibly make again. I think it’s useful to have here, and the right attitude for the investigation is “what do these events teach us about how rationalist groups can go wrong?” I also don’t think posting a summary would have been sufficient; it was necessary to hear Geoff’s and Anna’s exact words.
In fact, what I’d really like to see come out of this is Leverage’s and CFAR’s actual research, including negative results.
What experiments did they try? Is there anything true and surprising that came out of this? What dead ends did they discover (plus the evidence that these are truly dead ends)?
It’d be especially interesting if someone annotated Geoff’s giant agenda flowchart with what they were thinking at the time and what, if anything, they actually tried.
I’m also interested in the root causes of the harms that came to Zoe et al. Were they an inevitable consequence of Leverage’s beliefs? Or do the particular beliefs not really matter, with the real driver being the social dynamics in their group house?
Probably not what you wanted, but you can read CFAR’s handbook and updates (where they also reflect on some screwups). I am not aware of Leverage having anything equivalent publicly available.
I appreciate you sharing your perspective. A lot of this is uninteresting and irrelevant to perhaps the majority of readers (though I think that as you weight users by karma, you’d find that for more and more of them this is directly about the social dynamics around them).
I’m pretty pro this discussion happening somehow for the communities involved, and think it’s been pretty helpful in some important ways for it to happen as it has in public.
I wonder if there’s a natural way for it to be less emphasized for the majority for whom it is uninteresting. Perhaps it should only be accessible to logged-in accounts at the time of posting and become public six months later, or perhaps it should be relegated to a part of the site that isn’t the frontpage (note that we aren’t frontpaging it, which means logged-out users at least aren’t seeing it).
If there’s a good suggestion here, I’d be into that.
I think some of these are pretty reasonable points, but I am kind of confused by the following:
This Leverage drama is not important to anyone except a small group of people and does not belong on LW. Perhaps the relatively small group of Bay Area rationalists who are always at the center of these things need to create a separate forum for their own drama. Nobody outside of Berkeley needs to hear about this. This sort of thing gets upvoted because tribal instincts are being activated, not because this is good and ought to be here.
It seems to me that Leverage had a large and broad effect on the Effective Altruism and Rationality communities worldwide: it organized the 2013-2014 EA Summits, provided a substantial fraction of the strategic direction for EAG 2015 and EAG 2016, and shared multiple staff with the Centre for Effective Altruism until 2019.
This suggests to me that what happened at Leverage clearly had effects reaching much further than “some relatively small group of Bay Area rationalists”. Indeed, I think the Bay Area rationality community wasn’t that affected by all the stuff happening at Leverage; the effects seemed much more distributed.
Maybe you also think the EA Summit and EA Global conferences didn’t matter, which seems like a fine take. Or maybe you think how CEA leadership worked didn’t matter either, which also seems fine. But neither of these is an obvious take, and I think I disagree with both of them.
“Problematic dynamics happened at Leverage” and “Leverage influenced EA Summit/Global” don’t imply “Problematic dynamics at Leverage influenced EA Summit/Global” if EA Summit/Global had their own filters against problematic influences. (If such filters failed, it should be possible to point out where.)
I donate a meaningful amount to CFAR and MIRI (without being overly specific, >1% of my income to those two orgs), and check LW weekly-ish, and I had never even heard of Leverage until the recent kerfuffle. Anecdote isn’t data but I sort of agree with this comment’s grandparent here.
It seems to me that Leverage had a large and broad effect on the Effective Altruism and Rationality communities worldwide: it organized the 2013-2014 EA Summits, provided a substantial fraction of the strategic direction for EAG 2015 and EAG 2016, and shared multiple staff with the Centre for Effective Altruism until 2019.
For me personally this still rounds off to “not very important.” Especially in the sense that there is nothing I, or the vast majority of people on this site, could possibly do with this information. I was already never going to join Leverage, or give any money to Geoff Anders. I have a lot of rationalist friends, both IRL and online, and none of us had ever heard about Geoff Anders prior to this recent drama.
Think about it in terms of cost-benefit. The benefit of this kind of content to the vast majority of people on LW is zero. The cost is pretty high, because ~everybody who sees a big juicy drama fest is going to want to rubberneck and throw in their two cents. So posting content like this to the main LW feed is strongly net negative in aggregate. A post that is simply dumb or wrong, but otherwise un-dramatic, can at least be simply ignored.
I think that if it were, say, Yudkowsky being accused of auditing people’s thetans and having seances, I would find that relevant, because it would have implications for my future decisions.
A few things.
What do you think of Anna’s https://www.lesswrong.com/posts/SWxnP5LZeJzuT3ccd/pr-is-corrosive-reputation-is-not ? (I don’t know that I fully understand her view in that post, but it seems like a fruitful place to look for cruxes, given how much you talk about “PR” and “optics” here.)
=P
<3
Sorry. I was in a really shitty mood. That wasn’t nice of me.
<3
I will note that I think it’s completely valid to hold each of the following views:
My recent stuff is badly written
My recent stuff is on a topic we should spend less time on
My recent stuff made things worse
… I, like, hope those things aren’t true, but they are worthwhile hypotheses.