In Defense of the Fundamental Attribution Error

The Fundamental Attribution Error

Also known, more accurately, as “Correspondence Bias.”

http://lesswrong.com/lw/hz/correspondence_bias/

The “more accurately” part is pretty important; bias -may- result in error, but need not -necessarily- do so, and in some cases may result in reduced error.

A Simple Example

Suppose I write a stupid article that makes no sense and rambles on without any coherent point. There might be a situational cause of this; maybe I’m tired. Correcting for correspondence bias means giving more weight to the situational explanation than to the dispositional explanation: that I’m the sort of person who writes stupid, rambling articles. The question becomes, however, whether that correction increases the accuracy of your assessment of me; does correcting for this bias make you, in fact, less wrong?

In this specific case, no, it doesn’t. A person who belongs to the class of people who write stupid articles is more likely to write stupid articles than a person who doesn’t belong to that class—I’d be surprised if I ever saw Gwern write anything that wasn’t well-considered, well-structured, and well-cited. If somebody like Gwern or Eliezer wrote a really stupid article, we have sufficient evidence that he isn’t a member of that class of people to make the dispositional conclusion a poor one; the situational explanation is better: he’s having some kind of off day. However, given an arbitrary stupid article written by somebody for whom we have no prior information, the distribution is substantially different. We assign a different probability to “X is a bad writer of articles”, given “Randomly chosen person X writes an article” and “The article is bad”, than we do to “Y is a bad writer of articles”, given “Well-known article author Y writes an article” and “The article is bad”.
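
To make the prior-dependence concrete, here is a minimal Bayesian sketch. The likelihoods and priors below are invented purely for illustration (they aren’t drawn from any study or from the post above); the point is only that the same piece of evidence moves the two posteriors very differently.

```python
# Illustrative Bayesian update: how much a single bad article should shift our
# belief that the author is "a bad writer of articles", under two different priors.
# All numbers are made-up assumptions for illustration.

def posterior_bad_writer(prior_bad, p_bad_article_given_bad=0.6,
                         p_bad_article_given_good=0.05):
    """P(bad writer | bad article) via Bayes' rule."""
    numerator = p_bad_article_given_bad * prior_bad
    evidence = numerator + p_bad_article_given_good * (1 - prior_bad)
    return numerator / evidence

# Unknown author: a fairly uninformative prior.
print(posterior_bad_writer(prior_bad=0.30))  # ~0.84 -> the dispositional conclusion looks reasonable

# Well-known author with a strong track record: a tiny prior.
print(posterior_bad_writer(prior_bad=0.01))  # ~0.11 -> the situational explanation stays more plausible
```

Same evidence, very different posteriors: whether jumping to the dispositional conclusion counts as an error depends almost entirely on the prior you start with.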

Getting to the Point

The FAE is putting the emphasis on internal factors rather than external ones. It’s jumping first to the conclusion that somebody who just swerved is a bad driver, rather than first considering the possibility that they were avoiding an object in the road, given only the evidence that they swerved. Whether or not the FAE is an error—whether it makes you more wrong—depends on whether the conclusion you jumped to was correct, and more importantly, whether, on average, that conclusion would be correct.

It’s very easy to produce studies in which the FAE results in people making incorrect judgements. This is not, however, the same as the FAE producing more incorrect judgements on average in the real world.
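
As a toy illustration of that distinction, the sketch below compares the average accuracy of a judge who habitually jumps to dispositional explanations against a “corrected” judge, under two hypothetical base rates. All of the numbers are assumptions chosen to make the point, not empirical estimates.

```python
# Toy numbers (assumptions, not from any study): average accuracy of a
# dispositional-leaning judge vs. a "debiased" judge under different base rates.

def average_accuracy(p_dispositional, p_guess_dispositional):
    """Chance a guess matches the true cause, with guesses made independently of the case."""
    return (p_dispositional * p_guess_dispositional
            + (1 - p_dispositional) * (1 - p_guess_dispositional))

for label, p_true in [("lab scenario, mostly situational causes", 0.2),
                      ("hypothetical everyday base rate", 0.8)]:
    fae_judge = average_accuracy(p_true, p_guess_dispositional=0.9)        # jumps to disposition
    corrected_judge = average_accuracy(p_true, p_guess_dispositional=0.4)  # weights the situation
    print(f"{label}: FAE-style judge {fae_judge:.2f}, corrected judge {corrected_judge:.2f}")
```

Under the study’s base rate the corrected judge wins; under the everyday base rate the biased judge does. Which heuristic is “less wrong” is a fact about the environment, not just about the judge.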

Correspondence Bias as Internal Rationalization

I’d suggest the major issue with correspondence bias is not, as it’s commonly presented, incorrectly interpreting the behavior of other people; the major issue is incorrectly interpreting your own. The error is not in how you explain other people’s behavior, but in how you explain yours.

Turning to Eliezer’s example in the linked article, if you find yourself kicking vending machines, maybe the answer is that -you- are a naturally angry person, or, as I would prefer to phrase it, that you have poor self-control. The “floating history” Eliezer refers to sounds to me more like rationalizations for poor behavior than anything approaching “good” reasons for expressing your anger through violence directed at inanimate objects. I noticed -many- of those rationalizations cropping up when I quit smoking: “Oh, I’m having a terrible day, I could just have one cigarette to take the edge off.” I don’t walk by a smoker and assume they’ve had a terrible day, however, because those were -excuses- for a behavior I shouldn’t have been engaging in.

It’s possible, of course, that Eliezer’s example was simply a poorly chosen one; the examples used in studies certainly seem better, such as assuming that the authors of articles held the positions they wrote about. But the examples used in those studies are also extraordinarily artificial, at least in individualistic countries, where it’s assumed, and generally true, that people writing articles have the freedom to write what they agree with, and where infringements of this (say, a newspaper asking a columnist to change a review to be less hostile to an advertiser) are regarded very harshly.

Collectivist versus Individualist Countries

There’s been some research comparing collectivist societies to individualist societies; collectivist societies don’t show the same level of correspondence-bias effect. A point to consider, however, is that for collectivist societies the artificial scenarios used in these studies are more “natural”: it’s part of their society to adjust themselves to the circumstances, whereas individualist societies see circumstance as something that should be adapted to the individual. It’s -not- an infringement, or unexpected, for the state-owned newspaper to require that everything written be pro-state.

Maybe the differing levels of effect are less a matter of “collectivist societies are more sensitive to environment” than of a heuristic that is accurately calibrated in both cultures, simply calibrated against different test cases.

Conclusion

I don’t have anything conclusive to say here, merely a position: the correspondence bias is a bias that, on the whole, helps people arrive at more accurate, rather than less accurate, conclusions, and it should be corrected with an eye toward improving accuracy and correctness, rather than toward the mere elimination of bias.