Explanations: Ignorance vs. Confusion

When people notice that they don’t understand something, they want an explanation. Normally this works out okay, and we can either find an explanation or gracefully admit our ignorance. To use Richard Feynman’s example, suppose someone asks why their Aunt Minnie is in the hospital. “Because she slipped and fell on the ice and broke her hip” will explain the matter to them, and nobody has to do any weird meta-reasoning about the nature of explanations.

In contrast, consider a deep philosophical question: why do mirrors flip left and right, but not up and down? (If this is old hat, hopefully you can still remember how it felt. If this is new hat, and you like puzzles, stop and give it a think before reading Brienne on it.) Here, it’s not that we’re ignorant, but that we’re confused. The resolution isn’t like filling in a blank space of ignorance; it’s like putting together a puzzle out of a pile of pieces, not all of which match.

Weird meta-reasoning about explanations is somewhat useful in this second kind of puzzle, and is especially useful when you’re not sure whether a puzzle is ignorance or confusion. You want to be able to recognize an explanation when you’ve got one, and most importantly you want to avoid declaring victory prematurely.

If you read my posts on numbers, or if you just really liked Righting a Wrong Question, it will not surprise you that I want to go even more meta. Rather than immediately trying to lay out how to tell if something is explained, we want to start with some nice easy questions about what explanations do inside our heads.


When Aunt Minnie is in the hospital, you already know more or less how the explanation will fit into your map of the world. You’ll take the mental item for “Aunt Minnie in the hospital now” in your big warehouse of mental representations and associations, associate it with some events or actions that provide a history of how Aunt Minnie got hospitalized, and then you’ll feel done.

This isn’t how it is in the case of the mirror. At the start you think the mirror flips things left to right, perhaps associated with a visualization of a person standing on the other side of the mirror, because that is a cognitively easy mechanism that generates what you see. But this belief conflicts with your understanding that mirrors are symmetrical—when you notice this conflict and can’t resolve it, you feel confusion. No matter how good a causal story you can tell about how the light bounces off the mirror, the problem won’t go away until you can resolve this conflict.

Even after you read an explanation like Brienne’s about how to understand mirrors in terms of flipping across their mirror plane, you may still feel that sense of confusion. This is because agreeing with verbal arguments isn’t necessarily sufficient to change your mental associations. You associated those wrong mechanisms with the action of a mirror precisely because they’re what’s easy to imagine. With practice (you do some every time the article tells you to imagine something), visualizing the correct action can become easy too, but for a little while you might feel a sense of ignorance, since you have no mechanism tightly associated with mirrors.

These two problems, and their resolutions, vary along a dimension between ignorance and confusion. Explanations of ignorance are about forming new associations or telling new stories. Explanations of confusion are about resolving conflicting beliefs, or about having to unlearn something intuitive and false in order to learn something true.

Both ends of the spectrum still feel to us like the resolution of cognitive tension, and we naively feel that something has been explained when the tension goes away. Both ignorance and confusion are present in real people and have to be addressed by most explanations. And while I do think the feelings are different, plenty of people mix them up, as we’ll see later in the example of consciousness.