If you keep telling yourself that you can’t just deliberately choose to believe the sky is green—then you’re less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore’s Paradox, belief in belief, or belief in self-deception.
If you keep telling yourself that you’d just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn’t be able to invest any credulity in it—
If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—
Then when push comes to shove—you may, indeed, fail.
I’d phrase this differently. It’s pretty clear from general evidence, not just from the examples provided, that people can readily believe contradictory things simultaneously. This is so commonly recognized that we have a well-known term for it: cognitive dissonance. Psychologists use the term to denote the unpleasant feeling we get when we hold two or more conflicting beliefs, ideas, or values. The human mind is incredibly plastic, constantly changing as we acquire new information. Given how complex the mind is, it’s not terribly surprising that updating one belief may fail to update a related belief if the mind doesn’t realize the two are connected. But when we eventually notice a contradiction, there is presumably a process for resolving it, rather than the weaker belief simply vanishing on the spot. To infer a little about what that process might look like, we can observe how people reduce and resolve easier-to-observe types of cognitive dissonance, such as the case where someone’s desires are in conflict. Wikipedia provides this example:
Cognitive dissonance theory is founded on the assumption that individuals seek consistency between their expectations and their reality. Because of this, people engage in a process called dissonance reduction to bring their cognitions and actions in line with one another. This creation of uniformity allows for a lessening of psychological tension and distress. According to Festinger, dissonance reduction can be achieved in four ways. In an example case where a person has adopted the attitude that they will no longer eat high fat food, but is eating a high-fat doughnut, the four methods of reduction would be:
Change behavior or cognition (“I will not eat any more of this doughnut”)
Justify behavior or cognition by changing the conflicting cognition (“I’m allowed to cheat every once in a while”)
Justify behavior or cognition by adding new cognitions (“I’ll spend 30 extra minutes at the gym to work this off”)
Ignore or deny any information that conflicts with existing beliefs (“This doughnut is not high fat”)
What might we get if we tried to generalize this to cases where beliefs are in conflict, rather than desires? Here’s my guess:
Change behavior or cognition (“My belief X is wrong.”)
Justify behavior or cognition by changing the conflicting cognition (“Well, X2 can still be true even if X1 isn’t.”)
Justify behavior or cognition by adding new cognitions (“X may conflict with Y, but Z can fix the issue.” or “If Z is true or the mind/world works like Z, then the apparent conflict between X and Y is explained away!”)
Ignore or deny any information that conflicts with existing beliefs (“Everyone has the right to their own opinion. Right or wrong, I prefer to believe X”, “My belief X doesn’t ACTUALLY conflict with Y”, “You just can’t compare X and Y”, or “You can’t apply logic to X”.)
I wonder whether all that is needed to make it easier to choose option 1 is for options 2-4 to become stigmatized. That is, if every time I am naturally inclined to choose option 4 I am reminded of all the discussion on LessWrong about trying not to be an option-4 person, and I naturally identify with the option-1 crowd and want to be more like the option-1-ers I admire, will my gut impulse be more likely to go to option 1? Or is changing one’s mind destined to always be a struggle of intellect vs. impulse?