Correspondence Bias

The correspondence bias is the tendency to draw inferences about a person’s unique and enduring dispositions from behaviors that can be entirely explained by the situations in which they occur.

—Gilbert and Malone1

We tend to see far too direct a correspondence between others’ actions and personalities. When we see someone else kick a vending machine for no visible reason, we assume they are “an angry person.” But when you yourself kick the vending machine, it’s because the bus was late, the train was early, your report is overdue, and now the damned vending machine has eaten your lunch money for the second day in a row. Surely, you think to yourself, anyone would kick the vending machine, in that situation.

We attribute our own actions to our situations, seeing our behaviors as perfectly normal responses to experience. But when someone else kicks a vending machine, we don’t see their past history trailing behind them in the air. We just see the kick, for no reason we know about, and we think this must be a naturally angry person—since they lashed out without any provocation.

Yet consider the prior probabilities. There are more late buses in the world than mutants born with unnaturally high anger levels that cause them to sometimes spontaneously kick vending machines. Now the average human is, in fact, a mutant. If I recall correctly, an average individual has two to ten somatically expressed mutations. But any given DNA location is very unlikely to be affected. Similarly, any given aspect of someone’s disposition is probably not very far from average. To suggest otherwise is to shoulder a burden of improbability.
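The prior-probability argument can be made concrete with Bayes’s theorem. The numbers below are purely illustrative assumptions of mine, not figures from the text: even if an “angry person” is ten times more likely to kick on any given day, the rarity of that disposition keeps the posterior small.

```python
# Illustrative Bayesian update: how much should one observed kick shift
# belief toward "angry disposition"? All probabilities are assumed.

p_disposition = 0.01  # prior: person has an unusually angry disposition (assumed)
p_situation = 0.99    # prior: ordinary person, whatever their situation (assumed)

p_kick_given_disposition = 0.5   # angry person kicks fairly often (assumed)
p_kick_given_situation = 0.05    # ordinary person kicks only on a very bad day (assumed)

# Total probability of observing a kick
p_kick = (p_kick_given_disposition * p_disposition
          + p_kick_given_situation * p_situation)

# Bayes's theorem: P(disposition | kick)
posterior_disposition = p_kick_given_disposition * p_disposition / p_kick
print(f"P(angry disposition | kick) = {posterior_disposition:.3f}")
```

Under these made-up numbers, a single kick raises the probability of an angry disposition from 1% to only about 9%—the situational explanation still dominates, which is the point about shouldering a burden of improbability.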

Even when people are informed explicitly of situational causes, they don’t seem to properly discount the observed behavior. When subjects are told that a pro-abortion or anti-abortion speaker was randomly assigned to give a speech on that position, subjects still think the speakers harbor leanings in the direction randomly assigned.2

It seems quite intuitive to explain rain by water spirits; explain fire by a fire-stuff (phlogiston) escaping from burning matter; explain the soporific effect of a medication by saying that it contains a “dormitive potency.” Reality usually involves more complicated mechanisms: an evaporation and condensation cycle underlying rain, oxidizing combustion underlying fire, chemical interactions with the nervous system for soporifics. But mechanisms sound more complicated than essences; they are harder to think of, less available. So when someone kicks a vending machine, we think they have an innate vending-machine-kicking-tendency.

Unless the “someone” who kicks the machine is us—in which case we’re behaving perfectly normally, given our situations; surely anyone else would do the same. Indeed, we overestimate how likely others are to respond the same way we do—the “false consensus effect.” Drinking students considerably overestimate the fraction of fellow students who drink, but nondrinkers considerably underestimate the fraction. The “fundamental attribution error” refers to our tendency to overattribute others’ behaviors to their dispositions, while reversing this tendency for ourselves.

To understand why people act the way they do, we must first realize that everyone sees themselves as behaving normally. Don’t ask what strange, mutant disposition they were born with, which directly corresponds to their surface behavior. Rather, ask what situations people see themselves as being in. Yes, people do have dispositions—but there are not enough heritable quirks of disposition to directly account for all the surface behaviors you see.

Suppose I gave you a control with two buttons, a red button and a green button. The red button destroys the world, and the green button stops the red button from being pressed. Which button would you press? The green one. Anyone who gives a different answer is probably overcomplicating the question.3

And yet people sometimes ask me why I want to save the world.4 Like I must have had a traumatic childhood or something. Really, it seems like a pretty obvious decision . . . if you see the situation in those terms.

I may have non-average views which call for explanation—why do I believe such things, when most people don’t?—but given those beliefs, my reaction doesn’t seem to call forth an exceptional explanation. Perhaps I am a victim of false consensus; perhaps I overestimate how many people would press the green button if they saw the situation in those terms. But y’know, I’d still bet there’d be at least a substantial minority.

Most people see themselves as perfectly normal, from the inside. Even people you hate, people who do terrible things, are not exceptional mutants. No mutations are required, alas. When you understand this, you are ready to stop being surprised by human events.

1Daniel T. Gilbert and Patrick S. Malone, “The Correspondence Bias,” Psychological Bulletin 117, no. 1 (1995): 21–38.

2Edward E. Jones and Victor A. Harris, “The Attribution of Attitudes,” Journal of Experimental Social Psychology 3 (1967): 1–24, http://www.radford.edu/~jaspelme/443/spring-2007/Articles/Jones_n_Harris_1967.pdf.

3Compare “Transhumanism as Simplified Humanism.” http://yudkowsky.net/singularity/simplified.

4See Eliezer Yudkowsky, “Artificial Intelligence as a Positive and Negative Factor in Global Risk,” in Global Catastrophic Risks, ed. Nick Bostrom and Milan M. Ćirković (New York: Oxford University Press, 2008), 308–345.