[link] I Was Wrong, and So Are You

An article in The Atlantic, linked by someone on the unofficial LW IRC channel, caught my eye. Nothing all that new for LessWrong readers, but it is still good to see any mention of such biases in mainstream media.

I Was Wrong, and So Are You

A libertarian economist retracts a swipe at the left—after discovering that our political leanings leave us more biased than we think.

...

You may have noticed that several of the statements we analyzed implicitly challenge positions held by the left, while none specifically challenges conservative or libertarian positions. A great deal of research shows that people are more likely to heed information that supports their prior positions, and discard or discount contrary information. Suppose that on some public issue, Anne favors position A, and Burt favors position B. Anne is more likely than Burt to agree with statements that support A, and to disagree with statements that support B, because doing so simplifies her case for favoring A. Otherwise, she would have to make a concession to the opposing side. Psychologists would count this tendency as a manifestation of “myside bias,” or “confirmation bias.”

Buturovic and I openly acknowledged that the set of eight statements was biased. But these were the statements we had available to us. And as we explained in the paper, some of them—including those on professional licensing, standard of living, monopoly, and trade—did not appear to fit neatly into a partisan debate. Yet even on those, respondents on the left fared worst. What’s more, in separate research, Buturovic found that the respondents themselves either had difficulty classifying some of the statements on an ideological scale, or simply believed those statements were not, prima facie, ideological. So while we thought the results were probably exaggerated because of the bias in the survey, we nonetheless felt that they were telling.

Buturovic and I largely refrained from replying to the criticism (much of which focused on myside bias) that followed publication of the article. Instead, we planned a second survey that would balance the first one by including questions that would challenge conservative and/or libertarian positions.

...

Buturovic began putting all 17 questions to a new group of respondents last December. I eagerly awaited the results, hoping that the conservatives and especially the libertarians (my side!) would exhibit less myside bias. Buturovic was more detached. She e-mailed me the results, and commented that conservatives and libertarians did not do well on the new questions. After a hard look, I realized that they had bombed on the questions that challenged their position. A full tabulation of all 17 questions showed that no group clearly out-stupids the others. They appear about equally stupid when faced with proper challenges to their position.

Writing up these results was, for me, a gloomy task—I expected critics to gloat and point fingers. In May, we published another paper in Econ Journal Watch, saying in the title that the new results “Vitiate Prior Evidence of the Left Being Worse.” More than 30 percent of my libertarian compatriots (and more than 40 percent of conservatives), for instance, disagreed with the statement “A dollar means more to a poor person than it does to a rich person”—c’mon, people!—versus just 4 percent among progressives. Seventy-eight percent of libertarians believed gun-control laws fail to reduce people’s access to guns. Overall, on the nine new items, the respondents on the left did much better than the conservatives and libertarians. Some of the new questions challenge (or falsely reassure) conservative and not libertarian positions, and vice versa. Consistently, the more a statement challenged a group’s position, the worse the group did.

The reaction to the new paper was quieter than I expected. Jonathan Chait, who had knocked the first paper, wrote a forgiving notice on his New Republic blog: “Insult Retractions: A (Very) Occasional Feature.” Matthew Yglesias, writing at ThinkProgress, summed up the takeaway: “Basically, there’s a lot of confirmation bias out there.” Nothing illustrates that point better than my confidence in the claims of the first paper, especially as distilled in my Wall Street Journal op-ed.

Shouldn’t a college professor have known better?

I break here to comment that I don't see why we would expect this to be so, given the reality of academia.

Perhaps. But adjusting for bias and groupthink is not so easy, as indicated by one of the major conclusions developed by Buturovic and sustained in our joint papers. Education had very little impact on responses, we found; survey respondents who’d gone to college did only slightly less badly than those who hadn’t. Among members of less-educated groups, brighter people tend to respond more frequently to online surveys, so it’s likely that our sample of non-college-educated respondents is more enlightened than the larger group they represent. Still, the fact that a college education showed almost no effect—at least for those inclined to take such a survey—strongly suggests that the classroom is no great corrective for myside bias. At least when it comes to public-policy issues, the corrective value of professional academic experience might be doubted as well.

Discourse affords some opportunity to challenge the judgments of others and to revise our own. Yet inevitably, somewhere in the process, we place what faith we have.