Sorting Out Sticky Brains

tl;dr: Just because it doesn’t seem like we should be able to have beliefs we acknowledge to be irrational, doesn’t mean we don’t have them. If this happens to you, here’s a tool to help conceptualize and work around that phenomenon.

There’s a general feeling that by the time you’ve acknowledged that some belief you hold is not based on rational evidence, it has already evaporated. The very act of realizing it’s not something you should believe makes it go away. If that’s your experience, I applaud your well-organized mind! It’s serving you well. This is exactly as it should be.

If only we were all so lucky.

Brains are sticky things. They will hang onto comfortable beliefs that don’t make sense anymore, view the world through familiar filters that should have been discarded long ago, see significances and patterns and illusions even if they’re known by the rest of the brain to be irrelevant. Beliefs should be formed on the basis of sound evidence. But that’s not the only mechanism we have in our skulls to form them. We’re equipped to come by them in other ways, too. It’s been observed[1] that believing contradictions is only bad because it entails believing falsehoods. If you can’t get rid of one belief in a contradiction, and that’s the false one, then believing a contradiction is the best you can do, because then at least you have the true belief too.

The mechanism I use to deal with this is to label my beliefs “official” and “unofficial”. My official beliefs have a second-order stamp of approval. I believe them, and I believe that I should believe them. Meanwhile, the “unofficial” beliefs are those I can’t get rid of, or am not motivated to try really hard to get rid of because they aren’t problematic enough to be worth the trouble. They might or might not outright contradict an official belief, but regardless, I try not to act on them.

To those of you with well-ordered minds (for such lucky people seem to exist, if we believe some of the self-reports on this very site), this probably sounds outrageous. If I know they’re probably not true… And I do. But they still make me expect things. They make me surprised when those expectations are flouted. If I’m asked about their subjects when tired, or not prepared for the question, they’ll leap out of my mouth before I can stop them, and they won’t feel like lies—because they’re not. They’re beliefs. I just don’t like them very much.

I’ll supply an example. I have a rather dreadful phobia of guns, and accordingly, I think they should be illegal. The phobia is a terrible reason to believe in the appropriateness of such a ban: said phobia doesn’t even stand in for an informative real experience, since I haven’t lost a family member to a stray bullet or anything of the kind. I certainly don’t assent to the general proposition “anything that scares me should be illegal”. I have no other reasons, except for a vague affection for a cluster of political opinions which includes something along those lines, to believe this belief. Neither the fear nor the affection is a reason I endorse for believing things in general, or this in particular. So this is an unofficial belief. Whenever I can, I avoid acting on it. Until I locate some good reasons to believe something about the topic, I officially have no opinion. I avoid putting myself in situations where I might act on the unofficial belief in the same way I might avoid a store with contents for which I have an unendorsed desire, like a candy shop. For instance, when I read about political candidates’ stances on issues, I skip whatever section talks about gun control.

Because I know my brain collects junk like this, I try to avoid making up my mind until I do have a pretty good idea of what’s going on. Once I tell myself, “Okay, I’ve decided”, I run the risk of lodging something permanently in my cortex that won’t release its stranglehold on my thought process until kingdom come. I use tools like “temporarily operating under the assumption that” (some proposition) or declaring myself “unqualified to have an opinion about” (some subject). The longer I hold my opinions in a state of uncertainty, the less the chance that I wind up with a permanent epistemic parasite that I have to devote cognitive resources to just to keep it from making me do dumb things. This is partly because it makes the state of uncertainty come to feel like a default, which makes it simpler to slide back to uncertainty again if that seems warranted. Partly, it’s because the longer I wait, the more evidence I’ve collected by the time I pick a side, so it’s less likely that the belief I acquire is one I’ll want to excise in the future.

This is all well and good as a prophylactic. It doesn’t help as much with stuff that snuck in when I was but a mere slip of a youth. For that, I rely on the official/unofficial distinction, and then toe the official line as best I can in thought, word, and deed. I break in uncomfy official beliefs like new shoes. You can use your brain’s love of routine to your advantage. Act like you only believe the official beliefs, and the unofficial ones will weaken from disuse. This isn’t a betrayal of your “real” beliefs. The official beliefs are real too! They’re real, and they’re better.

[1] I read this in Peter van Inwagen’s book “Essay on Free Will”, but I seem to remember that he got it elsewhere. I’m not certain where my copy has gotten to lately, so I can’t check.