Note that people probably tend to end up here by this very process. That is, of all the subcultures available to them, the subculture of people who are interested in rationality is the one they gravitated toward.
True … but I suspect that people who end up here do so because they take the herd's verbally endorsed beliefs more literally than average. Rationality as memetic immune disorder, failure to compartmentalize, etc.
Perhaps I should amend my original comment to say that if you are cognitively very different from the herd, you may want to use a bit of rationality/self-development like a corrective lens. You’ll have to run compartmentalization in software.
Maybe I should try to start a new trend: use {compartmentalization} when you want to invalidate an inference which most people would not make because of compartmentalization?
E.g. “I think all human lives are equally valuable.”
“Then why did you spend $1,000 on an iPad rather than giving it to GiveWell?”
“I refute it thus: {compartmentalization: near mode/far mode}”