I’ve been really impressed by the focused cross-pollination between transhumanism and rationality that I see at LW. I’m not sure I’d agree that increased individual rationality is the direct cause of increased cryonics signups, because other explanations seem more likely. As others have noted, this is a rare community where being signed up for cryonics is not weird but is, in fact, highly esteemed.
And since humans are (at least in many situations) motivated more by social factors than by abstract rational considerations, I expect the social factors to carry more explanatory weight. That isn’t to say cryonics is not more rational than the alternative of no cryonics! Rather, this community is one that tries (i.e., individuals are rewarded for trying) to build its standards on rationality and to reject standards that aren’t so grounded, and cryonics survives that process. If there were something grossly irrational or unethical about cryonics (as is commonly contended), it would not survive very easily in the memesphere of LessWrong.
But this brings us back to the concept of “advanced” rationality. If you can (a) keep your community continually pruned of bad ideas by shooting them down with the strongest logic available (and rewarding this behavior when it crops up), and (b) let that community’s norms dominate your decisions when they are strongly rationally grounded, then the outcome is that you will make more rational decisions. From the perspective of “rationality = winning,” this is no less valid than divorcing yourself from social impulses and expending loads of willpower to contradict the norm.
It’s more valid! It’s why we have meet-ups, and why SingInst runs rationality camps that are in such high demand.
(Yes, I agree with you)