Epistemic Luck

Who we learn from and with can profoundly influence our beliefs. There’s no obvious way to compensate. Is it time to panic?

During one of my epistemology classes, my professor admitted (I can’t recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.

What a peculiar thing for an epistemologist to admit!

Of course, on the one hand, he’s almost certainly right. Schools have their cultures, their traditional views, their favorite literature providers, their set of available teachers. These have a decided enough effect that I’ve heard “X was a student of Y” used to mean “X holds views basically like Y’s”. And everybody knows this. And people still show a distinct trend of agreeing with their teachers’ views, even the most controversial—not an unbroken trend, but still an obvious one. So it’s not at all unlikely that, yes, had the professor gone to a different graduate school, he’d believe something else about his subject, and he’s not making a mistake in so acknowledging...

But on the other hand… but… but...

But how can he say that, and look so undubiously at the views he picked up this way? Surely the truth about knowledge and justification isn’t correlated with which school you went to—even a little bit! Surely he knows that!

And he does—and so do I, and it doesn’t stop it from happening. I even identified a quale associated with the inexorable slide towards a consensus position, which made for some interesting introspection, but averted no change of mind. Because what are you supposed to do—resolutely hold to whatever intuitions you walked in with, never mind the coaxing and arguing and ever-so-reasonable persuasions of the environment in which you are steeped? That won’t do, and not only because it obviates the education. The truth isn’t anticorrelated with the school you go to, either!

Even if everyone collectively attempted this stubbornness only to the exact degree needed to remove the statistical connection between teachers’ views and their students’, it’s still not truth-tracking. An analogy: suppose you give a standardized English language test, determine that Hispanics are doing disproportionately well on it, figure out that this is because many speak Romance languages and do well with Latinate words, and deflate Hispanic scores to even out the demographics of the test results. This might give you a racially balanced outcome, but on an individual level, it will unfairly hurt some monolingual Anglophone Hispanics, and help some Francophone test-takers—it will not do as much as you’d hope to improve the skill-tracking ability of the test. Similarly, flattening the impact of teaching on student views won’t restore the truth-tracking those views would have had absent the trend; it’ll just yield the same high-level statistics you’d get if that bias weren’t operating.
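The analogy can be made concrete with a toy simulation. Every number in it (the 10-point Latinate-vocabulary bonus, the rates of Romance-language speakers, the skill distribution) is made up purely for illustration; the point is only that a group-level correction can even out group averages while making individual measurements worse.

```python
import random

random.seed(0)

# Each person: (hispanic?, speaks a Romance language?, true skill, raw score).
# The real advantage comes from speaking a Romance language, not from
# being Hispanic, so a group-level deflation targets the wrong variable.
people = []
for _ in range(10_000):
    hispanic = random.random() < 0.3
    # Most, but not all, Hispanic test-takers speak a Romance language;
    # a few non-Hispanic test-takers (e.g. Francophones) do too.
    romance = random.random() < (0.9 if hispanic else 0.1)
    skill = random.gauss(100, 15)           # true English skill
    score = skill + (10 if romance else 0)  # Latinate-vocabulary bonus
    people.append((hispanic, romance, skill, score))

hisp = [p for p in people if p[0]]
rest = [p for p in people if not p[0]]

# Deflate every Hispanic score by the observed group-average gap.
gap = (sum(p[3] for p in hisp) / len(hisp)
       - sum(p[3] for p in rest) / len(rest))

def adjusted(p):
    return p[3] - gap if p[0] else p[3]

# Group means are now exactly even...
mean_h = sum(adjusted(p) for p in hisp) / len(hisp)
mean_r = sum(adjusted(p) for p in rest) / len(rest)
print(f"group gap after deflation: {mean_h - mean_r:+.4f}")

# ...but monolingual Hispanics, who never had the bonus, now score below
# their skill, while Romance-speaking non-Hispanics keep their inflated
# scores uncorrected.
def mean_abs_err(group):
    return sum(abs(adjusted(p) - p[2]) for p in group) / len(group)

mono_hisp = [p for p in people if p[0] and not p[1]]
rom_rest  = [p for p in people if not p[0] and p[1]]
print(f"error, monolingual Hispanics:     {mean_abs_err(mono_hisp):.1f}")
print(f"error, Francophone non-Hispanics: {mean_abs_err(rom_rest):.1f}")
```

The demographic statistics come out balanced, yet the adjustment misfires on exactly the individuals the essay names: it penalizes monolingual Anglophone Hispanics and leaves the Francophones' bonus intact.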

Lots of biases still live in your head doing their thing even when you know about them. This one, though, puts you in an awfully weird epistemic situation. It’s almost like the opposite of belief in belief: disbelief in belief. “This is true, but my situation made me more prone than I should have been to believe it, and my belief is therefore suspect. But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky—those poor saps with other teachers have it wrong! But of course I would think that...”

It is possible, to an extent, to reduce the risk here—you can surround yourself with cognitively diverse peers and teachers, even if only in unofficial capacities. But even then, the people you spend the most time with, those you get along with best, those whose style of thought “clicks” most with yours, and—due to competing biases—those who already agree with you will have more of an effect than the others. In practice, you can’t sit yourself in a controlled environment and expose yourself to pure and perfect argument and evidence (and even then, accidental leanings could creep in via the order in which you read it).

I’m not even sure if it’s right to assign a higher confidence to beliefs that you happen to have maintained—absent special effort—in contravention of the general agreement. It seems to me that people have trains of thought that just seem more natural to them than others. (Was I the only one disconcerted by Eliezer announcing high confidence in Bayesianism in the same post as a statement that he was probably “born that way”?) This isn’t even a highly reliable way for you to learn things about yourself, let alone the rest of the world: unless there’s a special reason your intuitions—and not those of people who think differently—should be truth-tracking, these beliefs are likely to represent where your brain just happens to clamp down really hard on something and resist group pressure and that inexorable slide.