EA might systematically generate a scarcity mindset that produces low-integrity actors

Epistemic status: Highly speculative quick Facebook post. Thanks to Anna Riedl for nudging me to share it here anyway.

Something I’ve noticed recently is that some people who are in a bad place in their lives tend to have a certain sticky sleazy black holey feel to them. Something around untrustworthiness, low integrity, optimizing for themselves regardless of the cost for the people around them. I’ve met people like that, and I think when others felt around me like my energy was subtly and indescribably off, it was due to me being sticky in that way, too.

Game-theoretically, it makes total sense for people to be a bit untrustworthy while they are in a bad place in their life. If you’re in a place of scarcity, it is entirely reasonable to be strategic about where you put your limited resources. Then, it’s just reasonable to only be loyal to others as long as you can get something out of it yourself and to defect as soon as they don’t offer obvious short-term gains. And similarly, it makes sense for the people around you to be a bit wary of you when you are in that place.

And now, a bit of a hot take: I think most if not all of Effective Altruism’s recent scandals have been due to low-integrity sticky behavior. And, I think some properties of EA systematically make people sticky.

We might want to invest some thought and effort into fixing them. So, here are some of EA’s sticky-people-producing properties I can spontaneously think of, plus first thoughts on how to fix them that aren’t supposed to be final solutions:

1. Utilitarianism

Yudkowsky wrote a thing that I think is true:

“Go three-quarters of the way from deontology to utilitarianism and then stop. You are now in the right place. Stay there at least until you have become a god.”

Meanwhile, SBF and probably a bunch of other people in EA (including me at times) have gone all four quarters of the way. If there’s no point at which the numbers have gone up enough, you’ll be in a place of scarcity no matter what, and will be incentivized to defect indefinitely.

I think an explicit belief of “defecting is not a good utilitarian strategy” doesn’t help here: Becoming sticky is not a decision, but a subtle shift in your cognition that happens when your animal instincts pick up that your prefrontal cortex thinks you are in a place of scarcity.

Basically, I think Buddhism is what utilitarianism would be if it made sense and were human-brain-shaped: Optimizing for global optima, but from a place of compassion and felt oneness with all sentient beings, not from the standpoint of a technocratic puppet master.

2. Ever-precarious salaries

EA funders like to base their allocation of funds on evidence, and they like to be able to adjust course quickly as soon as there are higher expected-value opportunities. From the perspective of naive utilitarianism, this completely makes sense.

From the perspective of grantees, however, it feels like permanently having to justify your existence. And that is a situation that makes you go funny in the head in a way that is not conducive to getting just about any job done, unless it’s a job like fraud that inherently involves short-term thinking and defecting on society. Whether you treat people as trustworthy and competent or as the opposite, you’ll tend to find that you are right.

I don’t know how to fix this, especially now that both the FTX collapse and funders’ increased caution have made the precarity of EA funding even worse. Currently, I’m seeing two dimensions to at least partially solving this issue:

  1. Building healthier, more sustainable relationships between community members. That’s why I’m building Authentic Relating Berlin in parallel to EA Berlin and thinking about ways to safely(!) encourage memetic exchange between these communities. This doesn’t help with the precarious funding itself, but with the “I feel like I have to justify my existence!” aspect of writing a grant application.

  2. We might want to fundamentally redesign our institutions so that people feel trusted and we elicit trustworthy behavior in them.[1] For example, we might want to offer community members longer-term financial security that doesn’t simply cut off when they want to switch projects within the EA ecosystem, to give people more leeway, and to trust them more to do the best they can with the money they receive. I’ve found some organizations that had awesome success with similar practices in Frederic Laloux’s “Reinventing Organizations”, including a French manufacturing company named FAVI and the Dutch healthcare organization Buurtzorg. Some examples of EA meta work that I think are good progress towards finding forms of organizing that produce trustworthy people are Charity Entrepreneurship, the things Nonlinear builds (e.g. the Nonlinear Network), AI Safety Support, alignment.wiki, the various unconferences I’ve seen happening over the last years, as well as the Future Matters Project, a Berlin-based, EA-adjacent climate movement building org.

3. A not quite well-managed personal/professional overlap

EA sort of wants to be a professional network. At the same time, the kinds of people who tend to grow interested in EA have a lot of things in common they find few allies for in the rest of the world. So, it’s just obvious that they also want to be friends with each other. Thus grow informal friend circles with opaque entry barriers everywhere around the official professional infrastructure. Thus grow house parties you’ll want to get invited to so you can actually feel part of the tribe, and so you can tap into the informal high-trust networks which actually carry the weight of the professional infrastructure.

Some of the attempts within EA to solve this seem to be to push even more towards just being a professional network. I think that’s dangerously wrong, because it doesn’t remove the informal networks and their power. It just makes access to them harder, and people more desperate to get in.

Plus, humans are social animals, and if you stop them from socializing, they’ll stop showing up.

I think the solution lies in exactly the opposite direction: Creating informal networks with low entry barriers and obvious ways in, so that feeling like you belong to the tribe is not something you have to earn, but something you get for free right at the start of your EA journey. That’s what I’ve been working on with EA Berlin’s communication infrastructure over the last months. Now, I’m trying to figure out how to interface it more gracefully with impact-focused outreach and meetups.

  1. ^

    This is the aspect of this post I’m most unsure about.

Crossposted from the EA Forum.