The Sequences post Doublethink (Choosing To Be Biased) addresses the general form of this question, which is: "Is it ever optimal to adopt irrational beliefs in order to advance instrumental goals, such as happiness, wealth, etc.?"
I’ll quote at length what I think is the relevant part of the post:
For second-order rationality to be genuinely rational, you would first need a good model of reality, to extrapolate the consequences of rationality and irrationality. If you then chose to be first-order irrational, you would need to forget this accurate view. And then forget the act of forgetting. I don’t mean to commit the logical fallacy of generalizing from fictional evidence, but I think Orwell did a good job of extrapolating where this path leads.
You can’t know the consequences of being biased, until you have already debiased yourself. And then it is too late for self-deception.
The other alternative is to choose blindly to remain biased, without any clear idea of the consequences. This is not second-order rationality. It is willful stupidity.
Be irrationally optimistic about your driving skills, and you will be happily unconcerned where others sweat and fear. You won’t have to put up with the inconvenience of a seat belt. You will be happily unconcerned for a day, a week, a year. Then crash, and spend the rest of your life wishing you could scratch the itch in your phantom limb. Or paralyzed from the neck down. Or dead. It’s not inevitable, but it’s possible; how probable is it? You can’t make that tradeoff rationally unless you know your real driving skills, so you can figure out how much danger you’re placing yourself in. You can’t make that tradeoff rationally unless you know about biases like neglect of probability.
No matter how many days go by in blissful ignorance, it only takes a single mistake to undo a human life, to outweigh every penny you picked up from the railroad tracks of stupidity.
In other words, the trouble with willfully blinding yourself to reality is that you don't get to choose what you're blinding yourself to. It's very difficult to say, "I'm going to ignore rationality for these specific domains, and only these specific domains." The human brain really isn't set up like that. If you're going to abandon rational thought in favor of religious thought, are you sure you'll be able to stop before you're, e.g., questioning the efficacy of vaccines?
Another way of looking at the situation is by thinking about The Litany of Gendlin: