I wouldn’t describe many of them [my non-believer friends] as rationalists, particularly, but it seems that according to lesswrong doctrine, they are above the sanity waterline while my first friend group is below.
Am I the only one who thinks that this “sanity waterline” model is wildly inaccurate? The model assumes that false beliefs can be somehow ordered by the level of insanity, so that people who have achieved a given level of sanity are immunized against everything below that. This, however, seems to me completely remote from reality. Even if we can agree on some standardized “insanity ranking” for false beliefs—already an unrealistic assumption—there’s no way people’s actual sets of beliefs will conform to the “waterline” rule according to this ranking, not even as the roughest first approximation.
One essential reason for this is the signaling role of beliefs. When it comes to issues that don’t have significant instrumental implications, people are drawn to beliefs with the highest signaling value rather than accuracy. High-status beliefs can be anywhere from completely correct to downright crazy, and outside of strictly technical topics, there is typically nothing that would systematically push them towards the former. And indeed, in practice we usually see people with an eclectic mix of correct and ridiculously false beliefs (and everything in-between), with nothing resembling a systematic “sanity waterline.”
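The structural assumption here can be made concrete with a toy sketch (hypothetical belief labels and an invented ranking, nothing asserted in the thread): under a single waterline, rejecting one belief would commit a person to rejecting everything ranked as crazier, so an eclectic mix of the kind just described cannot be explained by any waterline.

```python
# Toy model of the "waterline" rule: a hypothetical total ranking of false
# beliefs by "insanity", plus the assumption that each person has a single
# waterline, so rejecting one belief implies rejecting everything crazier.

INSANITY_RANK = {"belief_A": 1, "belief_B": 2, "belief_C": 3, "belief_D": 4}  # higher = crazier (made up)

def implied_rejections(rejected, ranking):
    """Everything the waterline rule says must also be rejected, given some
    beliefs a person is known to reject."""
    cutoff = min(ranking[b] for b in rejected)
    return {b for b, r in ranking.items() if r >= cutoff}

def fits_waterline(held, rejected, ranking):
    """A belief profile fits the rule only if nothing held is ranked at or
    above the craziness of something rejected."""
    return not (implied_rejections(rejected, ranking) & held)

# An eclectic profile: rejects belief_B but holds the "crazier" belief_D.
print(fits_waterline(held={"belief_A", "belief_D"},
                     rejected={"belief_B"},
                     ranking=INSANITY_RANK))   # False -> no single waterline explains it
```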
If you read the original post by Eliezer, he uses the “sanity waterline” concept only as a loose metaphor. But since then the phrase itself seems to have taken on a life of its own.
Yes, I’ve read the original post. In my opinion, its main point is incorrect and it’s illusory to talk about a “sanity waterline” even as a loose metaphor.
If a belief is high-status and without personal harmful effects for those who hold it, it doesn’t matter at all how much it violates the basic principles of sound epistemology, logic, etc. -- even the smartest people will be drawn to it like iron dust to a magnet. Now, Eliezer looks at those few Nobel-winning scientists who adhere seriously to some traditional religion and sees them as an especially extreme example of irrationality that uniquely sticks out. Whereas in reality, the reason they stick out is not that they are especially irrational by some objective measure, but that they don’t conform to the prevailing status dynamic: traditional religion has been low-status among the intellectual elites for quite some time now. Among the beliefs that are high-status and shared by a whole lot of scientists—let alone the rest of the intellectual elite—many are, in my opinion, far more irrational, but don’t stick out simply because it’s considered a normal or even expected thing for a high-status person to believe. [1]
In that post, Eliezer almost reached the correct insight when he remarked how weird it would be for someone who believes in Santa Claus to win a Nobel prize. What he should have asked at that point is whether there are beliefs that are equally irrational but their status relative to traditional religion is presently as high as the status of traditional religion relative to belief in Santa Claus. (Not an exact comparison, but you get the idea. Plus I also disagree that the latter two are comparable, but that’s a complex topic in its own right.)
[1] Here, of course, the argument stumbles onto a catch-22 situation, since giving any concrete examples of such beliefs is guaranteed to be a highly controversial assertion, and not giving any sounds like empty talk.
Agreed. I used it partly in jest, to try and show that it WAS an unrealistic assumption.
Try applying the metaphor on the level of cultures rather than individuals. Cultures can vary in how many irrational ideas they tolerate or promote, how much they expect their members to be able to reflect on and justify their beliefs, and exactly what kinds of beliefs are useful to signal within them. This doesn’t require a precise ranking of the degrees of insanity of beliefs.
During the Cold War, would you say the USSR or the USA had a higher sanity waterline? Keep in mind that the USSR was an atheist state, while the USA had (and still has) a very religious culture.
I actually can’t say, not really having a sufficiently good grasp of the cultural history of either country during the 20th century. I’d rank both as obviously better than most cultures that have existed during human history, but still with widespread, obviously weird and counterproductive stuff. Both were sane enough at the command level to stick with the game theory of not nuking each other.
The USSR was nominally atheist, but my understanding, based on the testimony of my fencing coach, who grew up in Soviet Russia, is that levels of religious belief were, if not as high as in America, still substantial, and simply practiced in private.
There was also the issue of communism, which is nothing if not a cult.
So are you trying to say that the USSR wasn’t truly atheist? That sounds like no true Scotsman.
Actually, I said it wasn’t atheist AT ALL. About as atheist as Don Quixote was a knight. Even the atheism was a manifestation of the personality cult of Karl Marx.
Hold on, where did he say that? He said that they were cultish communists, but (as the OP points out), it’s quite possible to be rational about one thing and irrational about another, even though the same strategy would correctly guide you through both issues if you only followed it consistently.
I’m not sure I understand your comment. How exactly would you apply the “waterline” metaphor to comparison between cultures? If anything, it seems to me even less applicable, since there are even more degrees of freedom involved.
This is more of a vague intuition than something built up to withstand prolonged prodding, but individuals have very complex and idiosyncratic complexes of beliefs in their heads, while the beliefs that operate on the level of society get simplified into something that can be transmitted through language. The sets of professed beliefs in a culture need to be put into language, and are then much easier to subject to analysis than whichever thorny messes people are carrying in their heads.
As an example, saying that the sanity waterline is higher in health care in contemporary France than it was in health care in 15th century France seems to me to be saying something meaningful and probably true, but it’s a lot less obvious exactly how you’d go about talking about the sanity waterlines of individual French physicians.
The sets of professed beliefs in a culture need to be put into language, and are then much easier to subject to analysis than whichever thorny messes people are carrying in their heads.
I don’t think that’s the case. It seems to me that a very significant part of what people learn from the surrounding culture is not communicated explicitly. To take an important example, some of the essential skills for navigating through the reigning social norms are taboo to discuss explicitly, and you are expected to figure them out by tuning into subtle and implicit aspects of what you see and hear. I’m sure lots of society-wide beliefs are formed by such processes that don’t involve any explicit verbal formulation.
As an example, saying that the sanity waterline is higher in health care in contemporary France than it was in health care in 15th century France seems to me to be saying something meaningful and probably true,
I agree it can make sense when it comes to particular areas of knowledge, especially those that are technical or hard-scientific. But one would be hard pressed to find many examples when an all-encompassing comparison would make sense.
I don’t think that’s the case. It seems to me that a very significant part of what people learn from the surrounding culture is not communicated explicitly.
Yeah, I went and oversimplified there. Though I wonder if it still says something about the complexity of the patterns that the majority of the members of the culture can be expected to internalize them quite fully. It seems like there should be some low-common-denominator elements going on there compared to what goes on with people’s internal mental states.
I agree it can make sense when it comes to particular areas of knowledge, especially those that are technical or hard-scientific. But one would be hard pressed to find many examples when an all-encompassing comparison would make sense.
I thought about the Medieval Europe versus present-day Europe comparison, but Medieval Europe was probably reasonably sane when it came to agriculture or warfare. One big difference I can think of is how the two would react to a significant change in their circumstances. Present-day culture still isn’t quite good at responding to change in the most beneficial way, but Medieval culture could get away with not even considering that things might change in a big way and people would need to reassess how they go about dealing with things.
As far as I can tell, the sanity waterline idea emerged from frustration with people consistently reacting to certain types of new information in a stereotypical and easily refutable way; in a higher-sanity-waterline culture, people would have better cached thoughts to match with useful new ideas, or would feel obliged to recognize when something should be given more thought before opining about it.