tl;dr: “Tying the project of developing and promoting an art of rationality to the potentially false claim that AI risk is probably the most important cause area risks distorting this project.” I agree that this is a real consideration. I’d like to point out the converse risk: acting as if rationality implies less about AI risk being the most important cause area than it actually does also risks distorting (or may already have distorted) rationality. My guess is that it’s a risk worth taking regardless.