Helping CFAR means giving aspiring young rationalists the tools to make thoughtful, well-reasoned decisions not just for their own lives, but for the economic, political and technological future of our society.
Sounds exactly not like “raising the sanity waterline”. It sounds like “we are going to raise the sanity of a relatively small number of people (namely young aspiring rationalists), who will then benefit the broader world (via unspecified mechanisms, but which by implication seem likely to include doing good things on AI safety, or developing other helpful new technologies, etc.)”.
And the rest of what I quoted? And that paragraph in the context of the rest of what I quoted? And everything that @sunwillrise has quoted elsewhere in this subthread? Does none of that sound like “raising the sanity waterline” to you either? (Even the parts where e.g. Ben Pace talks about “massively increas[ing] the sanity waterline on a global scale”?)