Confirmation Bias As Misfire Of Normal Bayesian Reasoning


From the subreddit: Humans Are Hardwired To Dismiss Facts That Don’t Fit Their Worldview. Once you get through the preliminary Trump supporter and anti-vaxxer denunciations, it turns out to be an attempt at an evo psych explanation of confirmation bias:

Our ancestors evolved in small groups, where cooperation and persuasion had at least as much to do with reproductive success as holding accurate factual beliefs about the world. Assimilation into one’s tribe required assimilation into the group’s ideological belief system. An instinctive bias in favor of one’s “in-group” and its worldview is deeply ingrained in human psychology.

I think the article as a whole makes good points, but I’m increasingly uncertain that confirmation bias can be separated from normal reasoning.

Suppose that one of my friends says she saw a coyote walk by her house in Berkeley. I know there are coyotes in the hills outside Berkeley, so I am not too surprised; I believe her.

Now suppose that same friend says she saw a polar bear walk by her house. I assume she is mistaken, lying, or hallucinating.

Is this confirmation bias? It sure sounds like it. When someone says something that confirms my preexisting beliefs (eg ‘coyotes live in this area, but not polar bears’), I believe it. If that same person provides the same evidence for something that challenges my preexisting beliefs, I reject it. What am I doing differently from an anti-vaxxer who rejects any information that challenges her preexisting beliefs (eg that vaccines cause autism)?

When new evidence challenges our established priors (eg a friend reports a polar bear, but I have a strong prior that there are no polar bears around), we ought to heavily discount the evidence and slightly shift our prior. So I should end up believing that my friend is probably wrong, but I should also be slightly less confident in my assertion that there are no polar bears loose in Berkeley today. This seems sufficient to explain confirmation bias, ie a tendency to stick to what we already believe and reject evidence against it.
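To put rough numbers on this, here is a minimal sketch of the update in Python. The priors and the strength of my friend’s testimony are made-up figures for illustration only (I’m assuming her report is about 20x likelier if the animal was really there):

```python
def posterior(prior, p_report_if_true, p_report_if_false):
    """Bayes' rule: P(animal was really there | friend reports seeing it)."""
    numerator = prior * p_report_if_true
    return numerator / (numerator + (1 - prior) * p_report_if_false)

# Made-up numbers: the testimony is equally strong in both cases,
# roughly 20x likelier if the animal was actually there.
p_true, p_false = 0.9, 0.045

print(posterior(0.10, p_true, p_false))    # coyote prior 10%       -> ~0.69
print(posterior(0.0001, p_true, p_false))  # polar bear prior 0.01% -> ~0.002
```

Identical testimony, very different conclusions: the polar bear report does shift me (from 0.01% to about 0.2%), but nowhere near enough to believe it, while the same-strength coyote report is enough to tip me over.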

The anti-vaxxer is still doing something wrong; she somehow managed to get a very strong prior on a false statement, and isn’t weighing the new evidence heavily enough. But I think it’s important to note that she’s attempting to carry out normal reasoning, and failing, rather than carrying out some special kind of reasoning called “confirmation bias”.
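In the same spirit as the sketch above, here is a hypothetical version of that failure, again with made-up numbers: the prior is nearly 1 on the false claim, and the contrary evidence gets weighed at a fraction of its actual strength.

```python
# Made-up numbers for the failure mode, not real figures.
prior = 0.999          # near-certain prior that the (false) claim is true
p_e_if_false = 1.0     # a large null study is expected if the claim is false
p_e_if_true = 0.01     # ...and unlikely if it is true

# Bayes' rule with the evidence weighed at full strength:
post = prior * p_e_if_true / (prior * p_e_if_true + (1 - prior) * p_e_if_false)
print(post)            # ~0.91: even one properly weighed study dents the belief

# The same update with the evidence discounted to near-uselessness:
p_e_if_true = 0.67
post = prior * p_e_if_true / (prior * p_e_if_true + (1 - prior) * p_e_if_false)
print(post)            # ~0.998: the belief barely moves
```

The update rule is the same one as before; what’s broken is the inputs.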

There are some important refinements to make to this model – maybe there’s a special “emotional reasoning” that locks down priors more tightly, and maybe people naturally overweight priors because that was adaptive in the ancestral environment. Maybe after you add these refinements, you end up at exactly the traditional model of confirmation bias (and the one the Fast Company article is using) and my objection becomes kind of pointless.

But not completely pointless. I still think it’s helpful to approach confirmation bias by thinking of it as a normal form of reasoning, and then asking under what conditions it fails.