Today I learned that our friends at RationalWiki dislike effective altruism, to put it mildly. As David Gerard himself says, “it is neither altruistic, nor effective”.
In the section “Where ‘Effective Altruists’ actually send their money”, the main complaint seems to be that, among (I assume) respectable causes such as fighting diseases and giving money to poor people, effective altruists also support x-risk organisations, veganism, and meta organisations… or, in the language of RationalWiki, “sending money to Eliezer Yudkowsky”, “feeling bad when people eat hamburgers”, and “complaining when people try to solve local problems”.
Briefly looking at the numbers of donors in the surveys and trying to group the charities into categories (chances are I misclassified something): disease charities got 211+114+43+16 = 384 donors, poverty charities 101, Yudkowsky charities 77+45 = 122, meta charities 46+21+14+10+10 = 101, animal charities 27+22 = 49, and Leverage 7. So even if you think that only the disease and poverty charities are truly altruistic, that is still 485 of 764 donors (about 63%) giving money to truly altruistic charities. Uhm, could be worse, I guess.
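For the curious, the arithmetic above can be checked in a few lines of Python (the category grouping is my own reading of the survey, as noted, and may misclassify some charities):

```python
# Donor counts per category, grouped as in the text above.
# The grouping itself is an assumption on my part, not the survey's own taxonomy.
donors = {
    "disease":   211 + 114 + 43 + 16,   # 384
    "poverty":   101,
    "yudkowsky": 77 + 45,               # 122
    "meta":      46 + 21 + 14 + 10 + 10,  # 101
    "animal":    27 + 22,               # 49
    "leverage":  7,
}

total = sum(donors.values())                     # 764
narrow = donors["disease"] + donors["poverty"]   # 485, the "truly altruistic" pair

print(f"{narrow}/{total} = {narrow / total:.0%}")  # 485/764 = 63%
```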
Also, this is a weird complaint:
GiveWell has also recommended that people spam the Against Malaria Foundation (AMF) with all (except if they are billionaires, obviously) the money they have set aside to donate, on the grounds that they think it’s the best charity, even at the risk of exhausting the AMF’s room for more funding, amongst other dubious decisions.
Like, without any evidence that AMF’s room for funding was actually exhausted, this all reduces to: “we hate EAs because they do not send money to the best charities, and also because they send the best charities more money than they can handle”. But sneering was never supposed to be consistent, I guess.
There are over 100 edits to this article. Many of them, especially the large ones, were made by David Gerard, but Greenrd and others contributed as well.
It would be nice to have better tools for exploring wiki history. For example, I would like to select a sentence or two and get the history of that specific sentence: only the edits that modified it, preferably with all the historical versions of the sentence on a single page, along with the user names and links to the edits, so that I would not need to open each edit separately and hunt for the sentence.
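A minimal sketch of what such a tool could do, assuming you already have the full text of each revision (the MediaWiki API exposes this via `action=query&prop=revisions`): keep only the revisions in which the target sentence’s wording actually changed. The prefix-matching heuristic and the `(user, text)` revision format here are my own assumptions for illustration, not an existing API:

```python
import re

def sentence_version(text, prefix):
    """Return the sentence starting with `prefix` in `text`, or None if absent.

    A sentence is crudely taken to run until the next '.', '!' or '?'.
    Matching by a fixed prefix is a simplifying assumption: it breaks if
    an edit rewrites the very beginning of the sentence.
    """
    m = re.search(re.escape(prefix) + r"[^.!?]*[.!?]?", text)
    return m.group(0) if m else None

def sentence_history(revisions, prefix):
    """Given revisions as (user, full_page_text) pairs, oldest first,
    return (user, sentence) pairs for just the edits that changed the sentence."""
    history = []
    previous = object()  # sentinel distinct from any real version
    for user, text in revisions:
        current = sentence_version(text, prefix)
        if current != previous:
            history.append((user, current))
            previous = current
    return history

# Toy revision history: bob's edit touches a different sentence, so it is skipped.
revisions = [
    ("alice", "EA is a movement. It is fine."),
    ("bob",   "EA is a movement. It is great."),
    ("carol", "EA is a subculture. It is great."),
]
print(sentence_history(revisions, "EA is a"))
# [('alice', 'EA is a movement.'), ('carol', 'EA is a subculture.')]
```

The output is exactly the single-page view described above: one line per version of the sentence, tagged with the user who changed it; a real tool would add timestamps and diff links from the same API response.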
It is also interesting to compare Wikipedia and RationalWiki articles on the same topic.
The Wikipedia narrative is that EA is a high-status “philosophical and social movement” responsible for over $400,000,000 in donations in 2019, based on principles of “impartiality, cause neutrality, cost-effectiveness, and counterfactual reasoning”, and that its prominent causes are “global poverty, animal welfare, and risks to the survival of humanity over the long-term future”.
The rationalist community is mentioned only briefly:
A related group that attracts some effective altruists is the rationalist community.
In addition, the Machine Intelligence Research Institute is focused on the more narrow mission of managing advanced artificial intelligence.
Other contributions were [...] the creation of internet forums such as LessWrong.
Furthermore, the Machine Intelligence Research Institute is included in the “Effective Altruism” infobox at the bottom of the page. A mention of Eliezer Yudkowsky was removed as not properly sourced (a fair point, I guess). The Wikiquote page on EA quotes Scott Alexander and Eliezer Yudkowsky.
The RationalWiki narrative is that “The philosophical underpinnings mostly come from philosopher Peter Singer [but] This did not start the effective altruism subculture”. “The effective altruism subculture — as opposed to the concept of altruism that is effective — originated around LessWrong.” “The ideas have been around a while, but the current subculture that calls itself Effective Altruism got a big push from MIRI and its friends in the LessWrong community.” And the problem, according to RationalWiki, is that rationalists believed MIRI to be an effective charity, which is a form of Pascal’s Mugging.
“effective altruists currently tend to think that the most important causes to focus on are global poverty, factory farming, and the long-term future of life on Earth. In practice, this amounts to complaining when people try to solve local problems, feeling bad when people eat hamburgers, and sending money to Eliezer Yudkowsky, respectively.”
...so, my impression is that according to Wikipedia, EA is high-status and mostly unrelated to the rationalist community; while according to RationalWiki, EA was effectively started by the rationalist community and is low-status.
One would also think that the ‘risk’ of ‘exhausting the AMF’s room for more funding’ would be something to celebrate.
Is RationalWiki still mostly “David Gerard’s Thoughts and Notes”? This kind of writeup shouldn’t come as a surprise.