Thanks for the interesting critique. I agree with you that EAs often make over-confident claims without solid evidence. That said, I don’t think it’s a huge issue that people sometimes understate how much it costs to save a life, since even the most pessimistic realistic estimates of this cost don’t undermine the case for donating significant sums to cost-effective charities.
Am I right in understanding that you think that too many EAs are pursuing earning to give careers in finance and technology, whereas you think they’d have greater impact if they worked in start-ups? If so, could you provide some more explanation of why you think this? It seems plausible to me that earning to give is one of the highest-impact career options for many EAs, given the enormous amount of good that donations to the most effective charities can do.
Finally, you say you “worry that effective altruists may actually be less effective than ‘normal’ altruists”. That’s a pretty striking claim! Can you expand on it a little? In particular, could you give a typical example of ‘normal’ altruism, and explain why you think it might be more effective than pursuing an earning to give career and donating large sums to a charity like SCI?
> In particular, could you give a typical example of ‘normal’ altruism
I gave the Simons Foundation as an example in my essay. Among other things, they fund the arXiv, which already seems to me to be an extremely valuable contribution. Granted, Simons made huge amounts of money as a quant, but as far as I know he isn’t explicitly an EA, and he certainly wasn’t “earning to give” in the conventional sense of just giving to top GiveWell charities.
> I gave the Simons Foundation as an example in my essay. Among other things, they fund the arXiv, which already seems to me to be an extremely valuable contribution.
Thanks. I agree that it’s not prima facie absurd that that donation did more good than an equivalent amount of money to AMF would have done. However, it seems significantly better than a typical example of normal altruism, which I’d think of as being something like a donation to a moderately effective domestic poverty charity.
I don’t think it’s fair to compare to a “typical example” of normal altruism, because most people who donate do not put much serious thought into what they’re going to do with their money. I think the fair comparison would be to altruists who are non-EAs but put comparable amounts of thought into what they do. At that point it’s not clear to me that EAs are doing better (especially if we look at the reference class of people who are willing to devote their entire career to a cause).
Of course, I agree that it would be good if as a society we shifted the cultural default for how to donate to be something more effective (e.g. “known to be effective charity” instead of “random reasonable-looking domestic charity”). This is one good thing that I see the EA movement accomplishing and hope that it will continue to accomplish.
OK, I see where you’re coming from, and you have a good point (though you might want to consider adjusting the phrasing of the claim in your original post, which as I said came across as very strong).