On a very large scale, if you think FAI stands a serious chance of saving the world, then humanity should dump a bunch of effort into it, and if nobody's dumping effort into it then you should dump more effort into it than you currently are. Calculations of marginal impact in POKO (probability of an OK outcome) per dollar are sensible for comparing two x-risk mitigation efforts competing for money, but in this case each marginal added dollar is rightly going to account for a very tiny slice of probability, and this is not Pascal's Wager. Large efforts with a success-or-failure criterion are rightly, justly, and unavoidably going to end up with small marginal probabilities per added unit of effort. It would only be Pascal's Wager if the whole route to humanity being OK were assigned a tiny probability, and a large payoff were then used to shut down further discussion of whether the next unit of effort should go there or to a different x-risk.
Thanks for answering. I just gave $100 to MIRI.