Why You Should Never Update Your Beliefs

Epistemic status: Invincible

“Since Cavalry scouts are often in direct contact with the enemy, their job can be considered one of the most dangerous jobs the Army has to offer.”

– something called “Operation Military Kids”

There’s some irony in the fact that Julia Galef’s rationalist self-help book The Scout Mindset favorably compares the scout, who hunts for new and reliable evidence, to the soldier, who fights off threats. But scouts have one of the most dangerous military occupations. To quote a random website, “cavalry scouts and recon units tread uncharted ground when it comes to conflict zones. They are usually at the tip of any advance and, therefore, meet the brunt of whatever resistance is lying in wait for them.”

Uncharted epistemic territory is dangerous because it’s awash with incorrect arguments which might convince you of their false conclusions. Many of these arguments are designed to be persuasive regardless of their accuracy. Scott Alexander describes succumbing to an “epistemic learned helplessness” after his failure to refute crackpots whose arguments are too carefully crafted to refute in any reasonable length of time:

What finally broke me out wasn’t so much the lucidity of the consensus view so much as starting to sample different crackpots. Some were almost as bright and rhetorically gifted as Velikovsky, all presented insurmountable evidence for their theories, and all had mutually exclusive ideas. After all, Noah’s Flood couldn’t have been a cultural memory both of the fall of Atlantis and of a change in the Earth’s orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike. So given that at least some of those arguments are wrong and all seemed practically proven, I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology, rather than those of the universally reviled crackpots who write books about Venus being a comet.

You could consider this a form of epistemic learned helplessness, where I know any attempt to evaluate the arguments is just going to be a bad idea so I don’t even try. If you have a good argument that the Early Bronze Age worked completely differently from the way mainstream historians believe, I just don’t want to hear about it. If you insist on telling me anyway, I will nod, say that your argument makes complete sense, and then totally refuse to change my mind or admit even the slightest possibility that you might be right.

(This is the correct Bayesian action: if I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way. I should ignore it and stick with my prior.)
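
The arithmetic behind that parenthetical is short: in odds form, posterior odds equal prior odds times the likelihood ratio, and a likelihood ratio of 1 leaves the prior untouched. Here is a toy sketch with made-up numbers (the function and the priors are my illustration, not anything from Scott’s post):

```python
# Odds-form Bayes: posterior odds = prior odds * likelihood ratio.
def posterior(prior, p_convincing_if_true, p_convincing_if_false):
    """Belief in a claim after hearing a convincing argument for it."""
    odds = prior / (1 - prior)
    odds *= p_convincing_if_true / p_convincing_if_false
    return odds / (1 + odds)

prior = 0.05  # made-up prior that a given crackpot theory is true

# Crackpots argue as convincingly as historians: likelihood ratio 1.
print(posterior(prior, 0.9, 0.9))  # 0.05 -- convincingness is no evidence
# If only true theories could be argued convincingly, updating would work.
print(posterior(prior, 0.9, 0.1))  # ~0.32 -- now the argument counts
```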

The solution is to ignore most evidence that would change your views. This strategy is well-supported by epistemology and psychology:

  1. Critical thinking is altogether on dubious footing. See Michael Huemer’s “Is Critical Thinking Epistemically Responsible?” (the link goes to his blog post summary; the full text is available at his website in Papers → Epistemology). He discusses the rationality of three strategies for forming a view on a “publicly-discussed issue”:

    “Credulity: You canvass the opinions of a number of experts, and adopt the belief held by most of them. In the best case, you find a poll of the experts; failing that, you may look through several books and articles and identify their overall conclusions.

    Skepticism: You give up on finding the answer, i.e., immediately suspend judgement.

    Critical Thinking: You gather the arguments and evidence that are available on the issue, from all sides, and assess them for yourself. You try thereby to form some overall impression on the issue. If you form such an impression, you base your belief on that. Otherwise, you suspend judgement.”

    And if you try critical thinking, you’ll either agree with the expert consensus (having wasted your time thinking), disagree with the experts (in which case you’re still more likely than not to be wrong), or suspend judgment (in which case you’ve wasted your time and ended up no better informed than when you started). Exceptions exist only when the expert class is biased or otherwise unsuitable for deference. In most cases it’s better to avoid thinking for yourself.

  2. Many of the arguments you read are optimized for persuasiveness, which weakens the evidence you get from your failure to refute them. Most people agree that advertising is misleading, but they’re more hesitant to admit the degree to which arguments in other media steer you toward the conclusions of motivated authors. Beyond the skill of individual rhetoricians armed with psychological research, algorithmic selection favors the most convincing appeals from every direction, which limits the signal you receive.

  3. Most of the views you hear aren’t independent at all. In addition to the media, the views you hear from friends or in conversation won’t be independent, especially if they’re all within a similar social circle. In some cases, you can hear the same position over and over from different people who all sourced it from the same author or from each other. The psychologists tell us that mere repetition is one of the strongest persuasive techniques, and it’s easy to fail to account for this as you watch your views gradually approach those of your new social groups (the sketch after this list shows how badly a naive updater overcounts repeated testimony).

  4. Changing your beliefs takes cognitive effort and makes your behavior less predictable. If you’re changing your views, people won’t know where you stand. They won’t know if you’ll hold the same opinions tomorrow that you hold today. You’ll be less able to make long-term commitments and you’ll spend a lot of cognitive effort evaluating arguments that could be spent blogging or building computer software.
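
As promised in point 3, here is a toy sketch of how repetition misleads a naive updater. Every number in it is an illustrative assumption (the prior, the likelihood ratio, the ten friends); the point is only that multiplying in correlated “evidence” as if it were independent manufactures near-certainty:

```python
# Ten friends repeat a claim they all got from the same blog post.
# Treating each retelling as independent evidence multiplies in the
# same likelihood ratio ten times; updating once on the shared source
# moves you far less. All numbers here are illustrative assumptions.
def update(belief, likelihood_ratio):
    """One Bayesian update in odds form."""
    odds = belief / (1 - belief) * likelihood_ratio
    return odds / (1 + odds)

prior = 0.2  # made-up prior in some contested claim
lr = 2.0     # assumed strength of one genuinely independent endorsement

naive = prior
for _ in range(10):
    naive = update(naive, lr)  # counts every retelling separately
print(f"naive, ten correlated retellings: {naive:.3f}")  # ~0.996

print(f"careful, one shared source: {update(prior, lr):.3f}")  # 0.333
```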

The prescription:

  1. Don’t take ideas seriously. Disagree with them even without any arguments in your favor.

  2. Don’t change your views when you hear counterarguments. Just keep the same view as you had before, especially if you’re unlikely to be hearing an independent opinion.

  3. Avoid having “strong opinions, weakly held.” Instead, hold weak opinions but don’t change them easily.

Disclaimer: This post’s title represents the opposite of my real position, which is that you should sometimes update your beliefs.