I think you’re selling people way short.
I agree there’s a depressingly huge majority out there that acts the way you describe, but, like, the whole point of having a rationalist community is to train ourselves to deal with that fact, and I think the people I interact with regularly basically have the skill of at least making a serious attempt at confronting deeply held beliefs.
(I certainly think many people in LW circles still struggle with it, especially in domains different from the ones they trained on. But I’ve seen people do at least a half-passable job at it pretty frequently.)
I think shminux may have in mind one or more specific topics of contention that he’s had to hash out with multiple LWers in the past (myself included), usually to no avail.
(Admittedly, the one I’m thinking of is deeply, deeply philosophical, to the point where the question “what if I’m wrong about this?” just gets the intuition generator to spew nonsense. But I would say this is less about an inability to question one’s most deeply held beliefs, and more about the fact that certain aspects of our world-models are still confused, and querying them directly may not lead to any new insight.)