Hi Ben,

Thanks for responding. My replies to your points are below.
Poor cause choices
There’s a lot being done on this front:
* GiveWell is running GiveWell Labs, and Holden has said he expects to find better donation opportunities outside of global health in the next few years.
* CEA is an advocate of further cause prioritisation research, and is about to hire Owen Cotton-Barratt to work full-time on it.
* 80k is about to release a list of recommended causes, which will not have global health at the top.
The point of this argument wasn’t that no organizations are working on it. In fact, the existence of this research strengthens my point, which was that people are donating now anyway, despite the fact that we appear to know very little. The attitude towards giving now vs. later seems to be “well, there’s a good case for either one” rather than “we really need to figure this out, because we may be pouring money down the drain”. That is evidence that people are stopping thinking at the level of “doesn’t obviously conflict with EA principles”.
Inconsistent attitude toward rigor
I think this is mainly because people use the best analysis that’s out there, and the best analysis for charity is currently much more in-depth than it is for these other issues. We’re trying to make progress on the other issues at 80k and CEA.
Again, the issue isn’t that nobody is trying to solve these problems; it’s that most people are far more worried about the charity analysis issue than about ancillary issues that are just as important. If our knowledge of, e.g., the cost-effectiveness of global health interventions were as limited as our knowledge elsewhere, would people still be donating to global health charities? I doubt it.
Poor psychological understanding
My impression is that people at CEA have worried about these problems quite a bit. At 80k, we try to work on this problem by highlighting members who are really trying rather than rationalising what they want, which we hope will encourage good norms. We’ll also consider calling people out, but it can be a delicate issue!
I’ve been following 80k and have not noticed this phenomenon. Can you give some examples?
Monoculture
I’m worried about this, but it’s difficult to change. All we can do is try to make an active effort to reach out to new groups.
This is definitely not all we can do (unless you take a tautologically broad interpretation of “make an active effort to reach out”). For instance, if a substantial fraction of effective altruists were raging sexists, it would be wise to fix our group norms before going “hey women! there’s this thing called effective altruism!”
Even supposing it is all we can do, is there anything we’re actually doing about it?
EA was started by some of the smartest, most well-meaning people I have ever met. It’s going to be almost impossible to avoid a decline in the quality of discussion as the circle is widened.
The point of the critique was not to list easily avoidable problems, but to list bad problems. If a decline in the quality of people is inevitable, then we had better find solutions to the problems it brings (e.g. epistemic inertia), or the decline of EA is inevitable too.
Read the responses to “poor cause choices” and “inconsistent attitude toward rigor” as: “while some EAs might be donating without enough thought, lots of others are investing most of their resources in doing more research”.
The monoculture problem is something we often think about how to fix at 80k. We haven’t come up with great solutions yet though.
I also argued that the decline in the FB group is not obviously important.
And if it’s difficult to avoid, but many movements started by a small group of smart people nevertheless go on to achieve a lot, that’s also evidence that it’s not important.