Eliezer explains it in his other comment (emphasis mine):
Before scanning, I precommit to renouncing, abjuring, and distancing MIRI from the argument in the video if it argues for no probability higher than 1 in 2000 of FAI saving the world, because I myself do not positively engage in long-term projects on the basis of probabilities that low (though I sometimes avoid doing things for dangers that small). There ought to be at least one x-risk effort with a greater probability of saving the world than this—or if not, you ought to make one. If you know yourself for an NPC and that you cannot start such a project yourself, you ought to throw money at anyone launching a new project whose probability of saving the world is not known to be this small.