We Change Our Minds Less Often Than We Think

Over the past few years, we have discreetly approached colleagues faced with a choice between job offers, and asked them to estimate the probability that they will choose one job over another. The average confidence in the predicted choice was a modest 66%, but only 1 of the 24 respondents chose the option to which he or she initially assigned a lower probability, yielding an overall accuracy rate of 96%.

—Dale Griffin and Amos Tversky1
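The gap the quotation describes can be checked with a line of arithmetic: if only 1 of 24 respondents chose the option they had initially rated less likely, then 23 of 24 chose consistently with their prediction. A minimal sketch (the 23-of-24 split is implied by the quoted figures):

```python
# Arithmetic behind the Griffin & Tversky figures quoted above.
respondents = 24
consistent_choices = 23  # only 1 of 24 chose their initially lower-probability option

accuracy = consistent_choices / respondents  # ~0.958, reported as 96%
average_confidence = 0.66                    # average stated confidence

print(f"Accuracy: {accuracy:.0%}")                               # 96%
print(f"Stated confidence: {average_confidence:.0%}")            # 66%
print(f"Underconfidence gap: {accuracy - average_confidence:.0%}")  # 30%
```

The striking part is not the 96% by itself but the 30-point gap between how often people's guesses came true and how confident they said they were.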

When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than another—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.

How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.

The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.

You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.

But we change our minds less often—much less often—than we think.

I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?

Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.

1Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.