Celebrating All Who Are in Effective Altruism

Elitism and Effective Altruism

Many criticize Effective Altruists as elitist. While this criticism is vastly overblown, it unfortunately has some basis, not only from the outside looking in but also within the movement itself, where some have explicitly argued for elitism.

Within many EA circles, there are status games and competition around doing “as much as we can,” and in many cases even judging and shaming, usually implicit and unintended but no less real, of those whom we might term softcore EAs. These are people who identify as EAs and donate money and time to effective charities, but otherwise lead regular lives, as opposed to hardcore EAs, who devote the brunt of their resources to advancing human flourishing. To be clear, there is no definitive, hard distinction between softcore and hardcore EAs, but it is a useful heuristic, as long as we keep in mind that softcore and hardcore are poles on a spectrum rather than binary categories.

We should help softcore EAs feel proud of what they do, and be wary of implying that being a softcore EA is somehow deficient, or merely the start of an inevitable path to becoming a hardcore EA. This sort of mentality has caused people I know to feel guilty and ashamed, and has led some to leave the EA movement. Remember that we all suffer from survivorship bias, since we see those who remained and not those who left; I specifically talked to people who left and tried to get their takes on why they did so.

I suggest we aim to respect people wherever they are on the softcore/hardcore EA spectrum. I propose that, from a consequentialist perspective, negative attitudes toward softcore EAs are counterproductive for doing the most good for the world.

Why We Need Softcore EAs

Even if the individual contributions of softcore EAs are much smaller than those of individual hardcore EAs, it is irrational and anti-consequentialist to fail to acknowledge and celebrate the contributions of softcore EAs, and yet that is the status quo in the EA movement. As in any movement, the majority of EAs are not deeply committed activists, but normal people for whom EA is a valuable but not primary identity category.

All of us were softcore EAs once—if you are a hardcore EA now, envision yourself back in those shoes. How would you have liked to have been treated? Acknowledged and celebrated or pushed to do more and more and more? How many softcore EAs around us are suffering right now due to the pressure of expectations to ratchet up their contributions?

I get it. I myself am driven by powerful emotional urges to reduce human suffering and increase human flourishing. Besides my full-time job as a professor, which takes about 40 hours per week, I’ve been working 50 to 70 hours per week for the last year and a half as the leader of an EA and rationality-themed meta-charity. As all people do, when I don’t pay attention, I fall unthinkingly into the mind projection fallacy, assuming other people think like I do and have my values, as well as my capacity for productivity and impact. I have a knee-jerk pattern as part of my emotional self to identify with and give social status to fellow hardcore EAs, and to consider us an in-group above softcore EAs.

These are natural human tendencies, but destructive ones. From a consequentialist perspective, they weaken our movement and undermine our capacity to build a better world and decrease suffering for current and future humans and other species.

More softcore EAs are vital for the movement itself to succeed. Softcore EAs can help fill talent gaps and donate to effective direct-action charities, which has a strong positive impact on the outside world. Within the movement, they support hardcore EAs emotionally by giving them a sense of belonging, safety, security, and encouragement, which are key for motivation and for mental and physical health. Softcore EAs also donate to and volunteer for EA-themed meta-charities, provide advice and feedback, and serve as evangelists for the movement.

Moreover, softcore EAs remind hardcore EAs of the importance of self-care and taking time off for themselves. This is something we hardcore EAs must not ignore! I’m speaking from personal experience here.

Fermi Estimates of Hardcore and Softcore Contributions

If we add up the resources contributed to the movement by softcore EAs, they will likely amount to substantially more than the resources contributed by hardcore EAs. For instance, the large majority of those who took the Giving What We Can and The Life You Can Save pledges are softcore EAs, and so, by definition, are all new entrants to the EA movement.

To attach some numbers to this claim, let’s do a Fermi estimate that uses some educated guesses to get at the actual resources each group contributes. Say that for every 100 EAs, there are 5 hardcore EAs and 95 softcore EAs. We can describe softcore EAs as contributing anywhere from 1 to 10 percent of their resources to EA causes (the range from The Life You Can Save pledge to the Giving What We Can pledge), so let’s guesstimate around 5 percent. Hardcore EAs, we can say, give an average of 50 percent of their resources to the movement. Using the handy Guesstimate app, here is a link to a model that shows softcore EAs contributing 480 resources and hardcore EAs contributing 250 resources per 100 EAs. These are educated guesses, and you can use the model I put together to plug in your own numbers for the number of hardcore and softcore EAs per 100 EAs, as well as the percent of their resources contributed. In any case, you will find that softcore EAs contribute a substantial amount of resources.
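For anyone who wants to check the arithmetic outside the Guesstimate app, here is a minimal Python sketch of the same Fermi estimate using simple point estimates. All the numbers are the assumptions stated above, not data, and since the Guesstimate model samples from distributions, its softcore figure of roughly 480 differs slightly from the 475 a point estimate gives.

```python
# Fermi estimate: resources contributed per 100 EAs, using point estimates.
# Assumptions (from the text above): each EA has 100 "resource units,"
# 5 of every 100 EAs are hardcore and 95 are softcore,
# softcore EAs contribute ~5% of their resources, hardcore EAs ~50%.

TOTAL_EAS = 100
HARDCORE_COUNT = 5
SOFTCORE_COUNT = TOTAL_EAS - HARDCORE_COUNT

RESOURCES_PER_PERSON = 100   # arbitrary units
SOFTCORE_SHARE = 0.05        # ~5% of resources given
HARDCORE_SHARE = 0.50        # ~50% of resources given

softcore_total = SOFTCORE_COUNT * RESOURCES_PER_PERSON * SOFTCORE_SHARE
hardcore_total = HARDCORE_COUNT * RESOURCES_PER_PERSON * HARDCORE_SHARE

print(f"Softcore EAs contribute {softcore_total:.0f} resource units per 100 EAs")
print(f"Hardcore EAs contribute {hardcore_total:.0f} resource units per 100 EAs")
# Point estimates give ~475 and ~250, close to the Guesstimate model's
# ~480 and ~250, which come from sampling distributions around these values.
```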

We should also compare the giving of softcore EAs to that of members of the general public, to get a better grasp of the benefits softcore EAs provide to the world. Let’s say a typical member of the general public contributes 3.5 percent of her resources to charitable causes, compared to 5 percent for softcore EAs. Being generous, we can estimate that the giving of non-EAs is 100 times less effective than that of EAs. Thus, using the same handy app, here is a link to a model that demonstrates the impact of giving by a typical member of the general public, 3.5, versus the impact of giving by a softcore EA, 500. The impact of giving by a hardcore EA is of course higher still, 5,000 as opposed to 500, but again, we have to remember that there are many more softcore EAs giving resources. You’re welcome to plug in your own numbers if you think my suggested figures don’t match your intuitions. Regardless, you can see how high-impact a typical softcore EA is compared to a typical member of the general public.
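As a rough check on this second comparison, here is a short, self-contained sketch that applies the assumed 100x effectiveness multiplier; the multiplier and the giving percentages are the guesses stated above, not measured figures.

```python
# Impact comparison per person, using the same assumed 100 resource units each.
# Assumptions: the public gives ~3.5% at baseline effectiveness, softcore EAs
# give ~5% and hardcore EAs ~50% at ~100x the effectiveness of typical giving.

RESOURCES_PER_PERSON = 100
PUBLIC_SHARE, SOFTCORE_SHARE, HARDCORE_SHARE = 0.035, 0.05, 0.50
PUBLIC_EFFECTIVENESS, EA_EFFECTIVENESS = 1, 100

public_impact = PUBLIC_SHARE * RESOURCES_PER_PERSON * PUBLIC_EFFECTIVENESS
softcore_impact = SOFTCORE_SHARE * RESOURCES_PER_PERSON * EA_EFFECTIVENESS
hardcore_impact = HARDCORE_SHARE * RESOURCES_PER_PERSON * EA_EFFECTIVENESS

print(f"General public member: {public_impact}")   # 3.5
print(f"Softcore EA:           {softcore_impact}")  # 500
print(f"Hardcore EA:           {hardcore_impact}")  # 5000
```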

Effective Altruism, Mental Health, and Burnout: A Personal Account

About two years ago, in February 2014, my wife and I co-founded our meta-charity. In the summer of that year, she suffered a nervous breakdown due to burnout over running the organization. I had to—or to be accurate, chose to—take over both of our roles in managing the nonprofit, assuming the full burden of leadership.

In the fall of 2014, I myself started to develop a mental disorder from the strain of doing both my professor job and running the organization, while also taking care of my wife. It started with heightened anxiety, which I did not recognize as abnormal at the time; after all, with the love of my life recovering very slowly from a nervous breakdown and me running the organization, anxiety seemed natural. I was flinching away from my problem, unwilling to recognize it and pretending everything was fine, until some volunteers at the meta-charity I run, most of them softcore EAs, pointed it out to me.

I started to pay more attention to this, especially as I began to experience fatigue spells and panic attacks. With the encouragement of these volunteers, who essentially pushed me to get professional help, I began to see a therapist and take medication, which I continue to do to this day. I scaled back the time I put into the nonprofit from an average of 70 hours per week to 50. To be honest, I occasionally put in more than 50, as I’m very emotionally motivated to help the world, but I try to restrain myself. The softcore volunteers at the meta-charity I run know about my workaholism and the danger of burnout for me, and they remind me to take care of myself. I also need to remind myself constantly that doing good for the world is a marathon, not a sprint, and that in the long run I will do much more good by taking it easy on myself.

Celebrating Everyone

As a consequentialist, I find that this analysis, along with my personal experience, convinces me that the accomplishments of softcore EAs should be celebrated as well as those of hardcore EAs.

So what can we do? We should publicly showcase the importance of softcore EAs. For example, we can encourage the publication of articles that give softcore EAs the recognition they deserve, alongside those who give a large portion of their earnings and time to charity. We can invite a softcore EA to speak about her/his experiences at the 2016 EA Global. We can publish interviews with softcore EAs. I’m not suggesting we should make most speakers softcore EAs, write most articles about them, or conduct most interviews with them. Overall, my take is that it’s appropriate to celebrate individual EAs in proportion to their labors, and as the numbers above show, hardcore EAs individually contribute quite a bit more than softcore EAs. Yet we as a movement need to push against the current norm of not celebrating softcore EAs, and these are some specific steps that would help us do that.

Let’s celebrate all who engage in Effective Altruism. Everyone contributes in their own way. Everyone makes the world a better place.

Acknowledgments: For their feedback on draft versions of this post, I want to thank Linch (Linchuan) Zhang, Hunter Glenn, Denis Drescher, Kathy Forth, Scott Weathers, Jay Quigley, Chris Waterguy (Watkins), Ozzie Gooen, Will Kiely, and Jo Duyvestyn. I bear sole responsibility for any oversights and errors remaining in the post, of course.


A different version of this, without the Fermi estimates, was cross-posted on the EA Forum.

EDIT: added link to post explicitly arguing for EA elitism