To an individual, perhaps; but there are almost certainly people out there who think rationality is important but don’t think FAI is important, and thus would be willing to donate to the rationality group but not to SIAI.
While I like the idea of FAI, I’m unconvinced that AGI is an existential threat within the next two or three human generations; but I’m confident that raising the sanity waterline will help in dealing with any existential risk, including AGI. Moreover, people who hold differing beliefs about x-risk should be able to agree that teaching rationality serves all of their concerns.