The claim that donating to the SIAI is the charity donation with the highest expected return* always struck me as rather arrogant, though I can see the logic behind it.
The problem is, firstly, that it's an extremely self-serving statement (equivalent to "giving us money is the best thing you can ever possibly do"); even if true, its credibility is reduced by the fact that the claim comes from the same people who would benefit from it.
Secondly, it requires me to believe a number of claims, each of which carries its own burden of proof, and whose conjunction carries a greater one still (see the toy illustration at the end of this comment). These include: "Strong AI is possible," "Friendly AI is possible," "The actions of the SIAI will significantly affect the results of investigations into FAI," and "The money I donate will significantly improve the effectiveness of the SIAI's research" (I expect the relationship between research effectiveness and funding isn't linear). All of these I have only your word for.
Thirdly, contrast this with other charities that are known to be very effective and can prove it, and whose results help people who are suffering right now (e.g. the Against Malaria Foundation).
Caveat, I’m not arguing any of the claims are wrong, but all the arguments I have for it come from people with an incentive in getting me to donate so I have reasonable grounds for questioning the whole construct from outside the argument.
*Can’t remember the exact wording but that was the takeaway of a headline in the last fundraiser.
I feel like I’ve heard this claimed, too, but… where? I can’t find it.
Here is the latest fundraiser; which line were you thinking of? I don’t see it.
Question #5.
Yup, there it is! Thanks.
Eliezer tends to be more forceful on this than I am, though. I'm less certain about how much x-risk reduction is purchased by donating to SI as opposed to donating to FHI or GWWC (because GWWC's members are significantly x-risk focused). But when this video was recorded, FHI wasn't working on AI risk as much as it is now, and GWWC barely existed.
I am happy to report that I'm more optimistic about the x-risk reduction purchased per dollar when donating to SI now than I was six months ago, because of stuff like this. We're getting the org into better shape as quickly as possible.
Where is this established? As far as I can tell, one cannot donate “to” GWWC, and none of their recommended charities are x-risk focused.
(Belated reply): I can only offer anecdotal data here, but as a member of GWWC I can say that many of the members are interested. Also, from listening to the directors, most of them are interested in x-risk issues as well.
You are right that GWWC isn't a charity (although it is likely to turn into one), and that their recommendations are non-x-risk. Their rationale for recommending charities depends on reliable data, and x-risk is one of those areas where a robust "here's how much more likely a happy singularity will be if you give to us" analysis looks very hard.
Neither can I, but IIRC Anna Salamon did an expected-utility calculation which came up with eight lives saved per dollar donated, no doubt impressively caveated and with error bars aplenty.
I think you’re talking about this video. Without watching it again, I can’t remember if Anna says that SI donation could buy something like eight lives per dollar, or whether donation to x-risk reduction in general could buy something like eight lives per dollar.