The main problem I see with this post is that it assumes it’s always advantageous to find out the truth and update one’s beliefs towards greater factual and logical accuracy. Supposedly, the only danger of questioning things too much is that the attempt might malfunction and instead move one towards potentially dangerous false beliefs (which I assume is what epithets such as “nutty” and “crazy” refer to).
Yet I find this assumption entirely unwarranted. The benefits of holding false beliefs can be greater than the costs. This typically happens when certain false beliefs have high positive signaling value, but don’t imply any highly costly or dangerous behavior. Questioning and correcting such beliefs can incur far more cost than benefit; one can try to continue feigning them, but for most people it will be at least somewhat difficult and unpleasant. There are also many situations where the discovery of truth can make one’s life miserable for purely personal reasons, and it’s in the best interest of one’s happiness to avoid snooping and questioning things too much.
It seems to me that the problem for uncompromising truth-seekers is not just how to avoid invalid reasoning that leads to crazy false beliefs, but also how to avoid forming true beliefs that will have negative signaling consequences or drastically reduce one’s happiness. Now, maybe you would argue that one should always strive for truth no matter what, but that requires a separate argument beyond what’s presented in the above post, which is by itself insufficient to address the reasons why people are “afraid to think fully about certain subjects.”
Speaking from experience: the way to avoid dwelling on true beliefs that damage one’s happiness without providing any value is to monitor one’s happiness. Or possibly to work on one’s depression.
For quite some time, my thoughts kept going back to the idea that your government can kill you at any time (the Holocaust), and that your neighbors can kill you at any time (Rwanda).
Eventually, I noticed that such thoughts were driven by an emotional pull rather than their relevance to anything I wanted or needed.
There’s still some residue—after all, it’s a true thought, and I don’t think I’m just spreading depression to occasionally point out that governments could build UFAI or be a danger to people working on FAI.
Unfortunately, while I remember the process of prying myself loose from that obsession, I don’t remember what might have led to the inspiration to look at those thoughts from the outside.
More generally, I believe there’s an emotional immune system, and it works better for some people than for others, at some times than at others, and probably (for a given individual) about some subjects than about others.
The problem with the most poignant examples is that it’s impossible to find beliefs that signal low status and/or disreputability in modern mainstream society and are also uncontroversially true. Mentioning any concrete belief that is, to the best of my knowledge, both true and disreputable will likely lead to a dispute over whether it’s really true. Yet claiming that there are no such beliefs at all is a very strong assertion, especially considering that such a state of affairs would be historically unprecedented.
To avoid getting into such disputes, I’ll give only two weaker and (hopefully) uncontroversial examples.
As one example, many people have unrealistically idealized views of some important persons in their lives: their parents, for example, or significant others. If they subject these views to rational scrutiny, and perhaps also embark on fact-finding missions about these persons’ embarrassing past mistakes and personal failings, their new opinions will likely be more accurate, but the result may make them much unhappier and possibly shatter their relationships, with all sorts of potentially awful consequences. This seems like a clear and realistic example where less accurate beliefs are in the best interest of everyone involved.
Or, to take another example, the post mentions people who expend some effort to follow certain forms of religious observance. For many people in various religious and ethnic groups, such behavior produces pleasant feelings of one’s own virtuousness, as well as positive signals to others that one is a committed, virtuous, and respectable member of the community, with all the advantages that follow from that. Now, if such a person scrutinizes the beliefs on which this behavior is based and concludes that they’re just superstitious nonsense, they will be forced to choose between the onerous and depressing burden of maintaining a dishonest facade and abandoning their observance to face awful social consequences. I don’t see how this can possibly be seen as beneficial, even though it would mean that their beliefs had moved closer to reality.
The problem with the most poignant examples is that it’s impossible to find beliefs that signal low status and/or disreputability in modern mainstream society and are also uncontroversially true.
This is a good point. Most ideas that are mistreated by modern mainstream society are not obviously true. Rather, they are treated as much less probable than a less-biased assessment would estimate. This tendency leads to many ideas being given a probability of 0%, when they really deserve a probability of 40-60% based on the current evidence. This is consistent with your experience (and mine) of examining various controversies and being unable to tell which positions are actually correct, based on the current evidence.
The psychology seems to combine a binary view of truth with a raised burden of proof for low-status beliefs: people are allowed to “round down” or even floor their subjective probabilities for undesirable beliefs. Any probability less than 50% (or 90%, in some discussions) can be treated as if it were zero.
Unfortunately, the English language (and probably others too) is horribly bad for communicating about probability, which allows this sort of sophistry to flourish. And the real world often fails to punish educated middle-class people for rounding or flooring probabilities in the socially desirable direction, even though anyone abusing probability this way would get destroyed in many practical endeavours (e.g. betting).
One method for avoiding bias is to identify when one is tempted to engage in such rounding and flooring of probabilities.
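To make the cost of such flooring concrete, here is a minimal sketch, assuming a claim whose true probability is 40%, of how a proper scoring rule (the Brier score) penalizes rounded-down reports:

```python
# Toy illustration: the expected Brier score for reporting probability
# `reported_p` on an event whose true probability is `true_p`.
# Lower is better; reporting honestly minimizes it.
def expected_brier(reported_p: float, true_p: float) -> float:
    return true_p * (1 - reported_p) ** 2 + (1 - true_p) * reported_p ** 2

true_p = 0.40  # the less-biased estimate of the "undesirable" claim
for reported_p in (0.40, 0.10, 0.0):
    print(f"report {reported_p:.2f} -> expected Brier "
          f"{expected_brier(reported_p, true_p):.3f}")

# report 0.40 -> expected Brier 0.240  (honest report; the minimum)
# report 0.10 -> expected Brier 0.330
# report 0.00 -> expected Brier 0.400  (flooring to 0% costs the most)
```

Under a logarithmic scoring rule, the penalty for reporting exactly 0% would be unbounded, which is the scoring-rule counterpart of being willing to bet everything against the event.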
I see your point. I agree that these people are moving away from a local optimum of happiness by gaining true beliefs.
As to the global optimum, it’s hard to say. I guess it’s plausible that the best of all possible happinesses involves false beliefs. Does it make sense that I have a strong ethical intuition to reject that kind of happiness?
(Anecdotally, I find the more I know about my loved ones’ foibles, the more I look on them fondly as fellow creatures.)
As one example, many people have unrealistically idealized views of some important persons in their lives: their parents, for example, or significant others. If they subject these views to rational scrutiny, and perhaps also embark on fact-finding missions about these persons’ embarrassing past mistakes and personal failings, their new opinions will likely be more accurate, but the result may make them much unhappier and possibly shatter their relationships, with all sorts of potentially awful consequences.
Consequences like… getting out of a relationship founded on horror and lies? I agree that could be painful, but I have a hard time seeing it as a net loss.
…true beliefs that will have negative signaling consequences or drastically reduce one’s happiness.
Do you have some examples of such beliefs?
Here’s a good example:
“The paper that supports the conventional wisdom is Jensen, A. R., & Reynolds, C. R. (1983). It finds that females have a 101.41 mean IQ with a 13.55 standard deviation versus males that have a 103.08 mean IQ with a 14.54 standard deviation.”
Now, people will lynch you for that difference of 1.67 IQ points (1.63%), unless you make excuses for some kind of bias or experimental error. For one thing, the overall average IQ is supposed to be 100. Also, some studies find females with the higher IQ.
But what about that other bit, the 7% difference in standard deviation? Stated like this, it is largely inoffensive, because people who know enough math to understand what it means usually know to disregard slight statistical variations in the face of specific evidence. But what if you take that to its logical conclusions concerning the male/female ratio of the top 0.1% smartest people, and then tell other people your calculated ratio? (To make sure it is a true belief, state it as “this study, plus this calculation, results in...”.) If you state such a belief, people will take it as a signal that you consider maleness to be evidence of being qualified. And since people are bad at math and will gladly follow a good cause regardless of truth, almost no one will care that looking at actual qualifications is necessarily going to swamp any effects from the statistics, nor will they care whether it is supported by a scientific study (weren’t those authors both male?). And the good-cause people aren’t even wrong: considering that people are bad at math and that there is discrimination against women, knowledge of that study will likely increase discrimination, whether through ignorance or intentional abuse, regardless of whether the study was accurate.
If you accept the above belief but decide that letting others know about it is a bad idea, then you still have to spend some amount of effort on guard lest you let the secret slip in your speech or actions. And odds are, such a belief would provide you zero benefits while exposing you to a small but constant drain on mental resources and a risk of social catastrophe.
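This actually checks out.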
But what if you take that to its logical conclusions concerning the male/female ratio of the top 0.1% smartest people, and then tell other people your calculated ratio?
You might be able to inoculate yourself against that by also calculating and quoting the conjugate male/female ratio of the lowest 0.1% of the population. That is something you should really be doing anyway any time you look at the highest or lowest X% of anything, lest people take your information as advice to build smaller schools or move to the country to prevent cancer.
You might be able to inoculate yourself against that by also calculating and quoting the conjugate male/female ratio of the lowest 0.1% of the population.
Why would that “inoculate” you? Yeah, it makes it obvious that you’re not talking about a mean difference (except for, you know, the real mean difference found in the study), but saying “there are more men than women in prisons and more men than women who are math professors at Harvard” is still not gender egalitarian.
Using those figures, 0.117% of males and 0.083% of females have IQs below 58.814, so if the sex ratio in whatever-you’re-thinking-of is much greater than 1.4 males per female, something else is going on.
Using those figures, 0.152% of males and 0.048% of females have IQs over 146.17, so if the sex ratio in whatever-you’re-thinking-of is much greater than 3.2 males per female, something else is going on.
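Those figures can be checked with a short calculation. Here is a minimal sketch, assuming normally distributed IQs with the Jensen & Reynolds parameters quoted above and a 50/50 sex ratio, with each cutoff chosen so that 0.1% of the combined population falls in the corresponding tail:

```python
# A sketch checking the quoted tail figures, assuming normal IQ
# distributions with the Jensen & Reynolds (1983) parameters and a
# 50/50 male/female population.
from scipy.optimize import brentq
from scipy.stats import norm

male = norm(loc=103.08, scale=14.54)
female = norm(loc=101.41, scale=13.55)

# Cutoffs placing 0.1% of the combined population in each tail.
top = brentq(lambda x: (male.sf(x) + female.sf(x)) / 2 - 0.001, 100, 200)
bottom = brentq(lambda x: (male.cdf(x) + female.cdf(x)) / 2 - 0.001, 0, 100)

print(f"top {top:.2f}: {male.sf(top):.3%} of males, "
      f"{female.sf(top):.3%} of females, "
      f"{male.sf(top) / female.sf(top):.1f} males per female")
print(f"bottom {bottom:.2f}: {male.cdf(bottom):.3%} of males, "
      f"{female.cdf(bottom):.3%} of females, "
      f"{male.cdf(bottom) / female.cdf(bottom):.1f} males per female")

# top 146.17: 0.152% of males, 0.048% of females, 3.2 males per female
# bottom 58.81: 0.117% of males, 0.083% of females, 1.4 males per female
```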
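Now, people will lynch you for that difference of 1.67 IQ points (1.63%), unless you make excuses for some kind of bias or experimental error.

The zero of the scale is arbitrary, so the “1.63%” is meaningless.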
In my experience (practically speaking, though not theoretically), true beliefs are literally always beneficial relative to false ones, though not always worth the cost of acquiring them.