Hindsight Devalues Science

This essay is closely based on an excerpt from Myers's Exploring Social Psychology; the excerpt is worth reading in its entirety.

Cullen Murphy, editor of The Atlantic, said that the social sciences turn up “no ideas or conclusions that can’t be found in [any] encyclopedia of quotations . . . Day after day social scientists go out into the world. Day after day they discover that people’s behavior is pretty much what you’d expect.”

Of course, the “expectation” is all hindsight. (Hindsight bias: Subjects who know the actual answer to a question assign much higher probabilities they “would have” guessed for that answer, compared to subjects who must guess without knowing the answer.)

The historian Arthur Schlesinger, Jr. dismissed scientific studies of World War II soldiers’ experiences as “ponderous demonstrations” of common sense. For example:

1. Better educated soldiers suffered more adjustment problems than less educated soldiers. (Intellectuals were less prepared for battle stresses than street-smart people.)

2. Southern soldiers coped better with the hot South Sea Island climate than Northern soldiers. (Southerners are more accustomed to hot weather.)

3. White privates were more eager to be promoted to noncommissioned officers than Black privates. (Years of oppression take a toll on achievement motivation.)

4. Southern Blacks preferred Southern to Northern White officers. (Southern officers were more experienced and skilled in interacting with Blacks.)

5. As long as the fighting continued, soldiers were more eager to return home than after the war ended. (During the fighting, soldiers knew they were in mortal danger.)

How many of these findings do you think you could have predicted in advance? Three out of five? Four out of five? Are there any cases where you would have predicted the opposite—where your model takes a hit? Take a moment to think before continuing . . .

. . .

In this demonstration (from Paul Lazarsfeld by way of Myers), all of the findings above are the opposite of what was actually found.1 How many times did you think your model took a hit? How many times did you admit you would have been wrong? That’s how good your model really was. The measure of your strength as a rationalist is your ability to be more confused by fiction than by reality.

Unless, of course, I reversed the results again. What do you think?

Do your thought processes at this point, where you really don’t know the answer, feel different from the thought processes you used to rationalize either side of the “known” answer?

Daphna Baratz exposed college students to pairs of supposed findings, one true (“In prosperous times people spend a larger portion of their income than during a recession”) and one the truth’s opposite.2 In both sides of the pair, students rated the supposed finding as what they “would have predicted.” Perfectly standard hindsight bias.

Which leads people to think they have no need for science, because they “could have predicted” that.

(Just as you would expect, right?)

Hindsight will lead us to systematically undervalue the surprisingness of scientific findings, especially the discoveries we understand—the ones that seem real to us, the ones we can retrofit into our models of the world. If you understand neurology or physics and read news in those fields, then you probably underestimate the surprisingness of findings there too. This unfairly devalues the contribution of the researchers; and worse, it will prevent you from noticing when you are seeing evidence that doesn’t fit what you really would have expected.

We need to make a conscious effort to be shocked enough.

1 Paul F. Lazarsfeld, “The American Soldier—An Expository Review,” Public Opinion Quarterly 13, no. 3 (1949): 377–404.

2 Daphna Baratz, How Justified Is the “Obvious” Reaction? (Stanford University, 1983).