Here on LW, we know that if you want to do the most good, you shouldn’t diversify your charitable giving.
If this is so, then why is the Singularity Institute spinning off a separate rationality org? Shouldn’t one of rationality or FAI be more important?
To an individual, perhaps; but there are almost certainly people out there who think rationality is important but don’t think FAI is important, and thus would be willing to donate to the rationality group but not to SIAI.
While I like the idea of FAI, I’m unconvinced that AGI is an existential threat in the next two or three human generations; but I’m confident that raising the sanity waterline will be of help in dealing with any existential risks, including AGI. Moreover, people who have differing beliefs on x-risk should be able to agree that teaching rationality is of common interest to their concerns.
Diminishing returns to either activity alone may also matter at that scale.
I think the rationality spinoff is, perhaps among other things, going to run paid workshops, which would be funded by non-charitable dollars.
OK, that sounds like a pretty good reason.