Why is it so hard to change people’s minds? Well, imagine if it wasn’t...

Epistemic status: Tying together the great works of others into something less great

I think the mind’s defenses against change in its beliefs are a form of anti-parasitism.

Society commonly bemoans the difficulty of changing people’s minds. We wish we could change the minds of our friends and family about all sorts of things: vaccines, policy, religious beliefs or the lack thereof, and on and on.

We struggle to convince ourselves of things, too. Diet, exercise, sleep, laziness or workaholism. We make the same New Year’s Resolutions, year in and year out, only to drop them after a week or two, just like every year past.

When we try to change someone’s mind, even our own, we do so in a remarkably useless way. If we’re not flinging angry insults on Facebook, at best we’re running down the same old list of reasons to believe X that they already heard, processed, and built counterarguments against years ago. When we try to change our own minds, we try pumping ourselves up, insulting ourselves, or just declaring Today to be The Day that Things Change™.

I’ve been in more arguments and debates than I care to count. They usually don’t lead anywhere: I say my side, they say theirs, and we both walk away even more convinced of our respective correctness. I’ve been the recipient of much well-meaning advice over the years, too, but my mind is excellent at making up counterexamples or not being swayed by the words (most of which are unchanged since I first heard them). And, of course, I’ve told myself many things about how to live and what to do. Sometimes I even believe my mind has finally integrated them.

In some sick meta-joke, our belief that mere arguments and debates and advice can change minds is itself resistant to change. We get caught in the same old arguments that go exactly the same way. We use the same old talking points, subconsciously believing that this time, they’ll surely work. Our debates become mere reflex, yet we never question our methods when they invariably fail again.

The mind’s defenses are powerful. When the part of the brain that integrates new information is damaged, another module takes over entirely and spins up arbitrary stories defending pre-existing beliefs.

Even in healthy people, our beliefs color our perception. Any ambiguity is resolved, automatically and subconsciously, in favor of what we already know. The misanthrope believes the brusque person was angry at them specifically; the anxious person believes the crowd’s laughter is at the way they look… everything neatly resolves to the map we already carry.

Why is it so hard to change a person’s mind?

Well… what would the world be like if it was easy?

Stubbornness of Beliefs as a Form of Anti-Parasitism

Imagine that a well-reasoned, seemingly airtight argument in favor of some position was all that was necessary to change a person’s mind. Or that a strong appeal to emotion worked just as well.

We’d all be happy slaves of whatever cult rose up first. We’d meet a charismatic leader who knew all the right emotional buttons to push and backed those appeals up with solid-looking data that addressed the common counterarguments, and we’d be eating out of the palm of his hand.[1]

Bad beliefs, beliefs that cause you to serve others without your consent, are parasites, no better than the fungus that takes over an insect’s brain.

If beliefs were easy to change, anyone could trivially recruit anyone else to their cause. That would create a strong selection pressure against gullibility: people who are trivially recruited into a cause not their own are much more likely to die, or at least not reproduce.

That is, if an evil, lying leader can recruit people with minimal effort, each individual becomes much less valuable to him. Tending to their needs and goals stops mattering, and over many generations I’d expect that neglect to kill off the gullible and leave behind minds that are harder to change.
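If you like seeing that argument run, here’s a toy simulation of the effect (entirely my own construction; every number in it is invented) in which credulous agents keep getting recruited, neglected, and out-reproduced:

```python
import random

# Toy model of selection against gullibility. All numbers are invented;
# the point is only the direction of the effect, not its magnitude.

random.seed(0)
POP, GENERATIONS = 1000, 50

# Each agent has a "credulity" in [0, 1]: the per-generation chance that
# a manipulator recruits them into a cause not their own.
population = [random.random() for _ in range(POP)]

for _ in range(GENERATIONS):
    survivors = []
    for credulity in population:
        recruited = random.random() < credulity
        # Recruited agents are neglected by their leader, so they are
        # less likely to survive long enough to reproduce.
        if not recruited or random.random() < 0.5:
            survivors.append(credulity)
    # The next generation inherits a survivor's credulity, plus noise.
    population = [
        min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, 0.02)))
        for _ in range(POP)
    ]

print(f"mean credulity after selection: {sum(population) / POP:.2f}")
# Starts near 0.50; ends well below it.
```

Under these made-up parameters the average credulity drifts steadily downward, which is the whole argument in miniature: gullible minds are cheap to exploit, so evolution stops making them.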

We see this in real cults. A cult cannot recruit people in one seminar or a few minutes with a charismatic figure. Cults slowly, systematically break you apart from your normal life, friends, and family. They subject you to months or years of psychological torment. The damage they do can take just as long to unlearn as it took to learn. The mind’s defenses are formidable.

This Is Gonna Hurt

So, what does change people’s minds? Few of us are cult leaders, and many of us have beliefs in ourselves or others that really would be better off abandoned. Consider the poor agoraphobic who spends most of their life in their room, or the sick man who doesn’t trust doctors. Consider ourselves: over- or underweight, exercising more or less than we’d like to, caught in relationships that we don’t like, or too afraid to take risks in new jobs or new cities.

Well, words don’t work. Adam Mastroianni has an excellent post on this very topic over on his Substack. Not only does he talk about the brain’s natural defenses against words, he also talks about how abysmally lossy words are at describing human experience. It’s a really good piece, definitely one of my personal favorites out of the entire Rationality project.

(You may fairly ask, “if words don’t work, why do we use them?” Scott Alexander answers that via XKCD: we seem to have an instinct to defend our beliefs, and arguing with words is one of the ways we do it. It seems reflexive, and it makes us feel better, even if we never actually change anyone’s mind.)

The brain is malleable, though you may not like the results. Scott has also written about using psychedelics to lower the mind’s defenses against belief change. Positive anecdotes abound: long-standing ruminations were suddenly thrown into the light and dissolved in their own obvious wrongness like mist. Unfortunately, lowering the brain’s defenses doesn’t just let true beliefs in; it lets false ones in too. People, per Scott, would sometimes walk away with weird new beliefs that matched the territory no better than the old ones.

I think there’s only one thing that actually works: moments when the map and the territory visibly don’t match. Moments where we encounter something our beliefs say should be extremely rare. Moments our priors assigned so little probability that we’re shocked. Evidence so obviously correct that the brain doesn’t even try to defend the existing belief. Failing that, a constant stream of weaker evidence, like when Daryl Davis convinced 200 KKK members to leave the group over the course of many years.

It has to be real Bayesian evidence, not second- or third-hand accounts. It has to be something they can see with their own eyes.[2] And it has to be either extremely strong evidence or a lot of it, delivered over a long period of time. And you have to, if you will, dance gracefully with your partner, leaving enough space and time for them to process what they’ve seen. Emotions like defensiveness are poisonous here.
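To put rough numbers on “extremely strong evidence, or a lot of it”, here’s a minimal sketch of Bayesian updating in odds form (my own toy example; the likelihood ratios are invented, and real minds are far messier than this):

```python
# A toy illustration of Bayesian updating: one very strong piece of
# evidence can move a belief about as far as many weak pieces combined.

def update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

belief = 0.05  # a firmly held "that can't be true" prior

# One shocking, map-breaking observation (a 100:1 likelihood ratio):
print(f"after one strong update:  {update(belief, 100):.2f}")  # ~0.84

# Versus a slow drip of mildly surprising evidence (2:1 each):
slow = belief
for _ in range(7):
    slow = update(slow, 2)
print(f"after seven weak updates: {slow:.2f}")  # ~0.87
```

Weak evidence does add up, but only as a steady stream over time, which is exactly what Davis supplied across years of conversations.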

This is not easy, and the evidence cannot be generated on demand; if it could, manipulators would mass-produce it, and we’d be right back at the parasitism problem.

Is It Any Easier to Convince Myself?

Kind of, yeah. You are the closest thing your mind has to an administrator, even if your control is nowhere near as complete as it is over your computer. You also benefit from living inside your own rich internal experience, the kind you can never see in someone else.

I think changing your own mind is much more about understanding it than about overwriting it. Methods like Cognitive Behavioral Therapy, a lot of the material here on LessWrong (personal favorite: Multiagent Models of Mind by Kaj Sotala), and meditation help you notice the deeper processes of your mind as something separate from your conscious experience.

I think a human starts out holding their beliefs as equivalent to the way the world really is: they equate the map and the territory. If, say, you don’t like a given movie, that means it’s objectively bad, and those who like it are Wrong! CBT and meditation buy you a small moment to realize that a belief is a product of your brain, and another moment to check whether it actually applies to the world. It’s like turning on notifications for your thoughts.

Hopefully you have some beliefs to change in mind. Maybe you want to think of the world as a safer place. Maybe you want to believe that people are nice and kind. Maybe you want to believe that junk food is bad for you, or that exercise can be useful and enjoyable.

The next step is to try stuff. Gather the evidence, both for and against your desired belief. Actually figure out what the territory is, as best you can. If the people in your town really are all jerks, it would be good to believe that! Your desired belief may become “I need to move out of here”.

One trick is to write down your hypotheses before heading out. If you want to be more sociable, but your mind believes “I can’t talk to anyone”, write that down as your hypothesis. Then you can compare it to what actually happened. It’s important to write this down before you leave: your brain will forget what it predicted and fit whatever experience you had back into the unwanted belief.
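As an illustration, here’s a hypothetical sketch of such a prediction log (the structure and field names are my own invention, not an established method; a paper notebook works just as well):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Prediction:
    """One pre-registered hypothesis, written down before the experiment."""
    hypothesis: str    # what your unwanted belief predicts will happen
    experiment: str    # what you will actually go and try
    outcome: str = ""  # filled in afterwards, as literally as possible
    made_on: date = field(default_factory=date.today)

log = [
    Prediction(
        hypothesis="I can't talk to anyone; strangers will brush me off.",
        experiment="Say hello and ask one question to three people at the meetup.",
    )
]

# Later, record what actually happened -- before memory rewrites it:
log[0].outcome = "Two chatted happily; one was busy. Nobody was hostile."
```

The only important design choice is that the hypothesis gets written before you leave and the outcome after you return; everything else is decoration.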

It may also help to try the methods you’d use to convince someone else. Your brain has hidden depths, and it can be helpful to approach it as a sort of conversation partner. Why does it hold this belief? What caused it? Why is it so strong? If the belief were challenged, what part of you would feel particularly unsafe? Sotala’s excellent Sequence linked above covers this in much more detail.

Changing your own mind is not a matter of overwriting some text in a book. It’s a careful, graceful dance through map and territory alike, one that might end up in an entirely different place than you first expected. This is not trivial.

  1. ^

    I’m not talking about Eliezer or anyone in particular! This is not an essay about how Rationalism, or any other group, is especially gullible! This is an essay about why gullibility seems like it should be selected against!

  2. ^

    Reading this made me realize another possible reason why changing minds is hard: the Argument from Low-Hanging Fruit. Easy-to-change beliefs have probably already been changed, leaving only the really tough ones.