I feel like I’ve heard this claimed, too, but… where? I can’t find it.
Neither can I, but IIRC Anna Salamon did an expected-utility calculation which came up with eight lives saved per dollar donated, no doubt impressively caveated and with error bars aplenty.
I think you’re talking about this video. Without watching it again, I can’t remember whether Anna says that SI donation specifically could buy something like eight lives per dollar, or whether donation to x-risk reduction in general could.
Here is the latest fundraiser; which line were you thinking of? I don’t see it.
Question #5.
Yup, there it is! Thanks.
Eliezer tends to be more forceful on this than I am, though. I seem to be less certain about how much x-risk reduction is purchased by donating to SI as opposed to donating to FHI or GWWC (because GWWC’s members are significantly x-risk focused). But when this video was recorded, FHI wasn’t working as much on AI risk (as it is now), and GWWC barely existed.
I am happy to report that I’m more optimistic about the x-risk reduction purchased per dollar when donating to SI now than I was 6 months ago. Because of stuff like this. We’re getting the org into better shape as quickly as possible.
Where is this established? As far as I can tell, one cannot donate “to” GWWC, and none of their recommended charities are x-risk focused.
(Belated reply): I can only offer anecdotal data here, but as a member of GWWC myself, I can say that many of the members are interested in x-risk. Also, from listening to the directors, most of them are interested in x-risk issues too.
You are right that GWWC isn’t a charity (although it is likely to turn into one), and its recommended charities are non-x-risk. The rationale for recommending a charity depends on reliable data, and x-risk is one of those areas where a robust “here’s how much more likely a happy singularity will be if you give to us” analysis looks very hard.
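For readers unfamiliar with the shape of the eight-lives-per-dollar figure discussed above: it comes from a back-of-envelope expected-utility calculation. A minimal sketch of that style of calculation follows, with entirely made-up placeholder numbers — these are illustrative assumptions, not Anna’s actual inputs or any figure endorsed by SI:

```python
# Back-of-envelope expected-utility sketch of "lives saved per dollar"
# from donating to x-risk reduction. All numbers are illustrative
# placeholders, not figures from the video.

future_lives_at_stake = 7e9          # e.g. roughly the current world population
risk_reduction_per_dollar = 1.14e-9  # assumed marginal drop in P(catastrophe) per $1

# Expected lives saved per dollar = lives at stake * marginal risk reduction
lives_saved_per_dollar = future_lives_at_stake * risk_reduction_per_dollar
print(f"~{lives_saved_per_dollar:.1f} expected lives saved per dollar")
```

Any real version of this hinges on the hugely uncertain `risk_reduction_per_dollar` term, which is presumably why the original came “impressively caveated and with error bars aplenty.”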