The illusion of explanatory depth (Rozenblit & Keil, 2002) seems like a particularly relevant example of that other side of the coin. If you ask people if they understand how something works, like a bicycle, a flush toilet, or a zipper, they’ll generally say that, yes, they understand it and could explain it. But if you ask them to draw a diagram and actually explain it, they’ll often get it wrong, and realize in the process that they don’t understand it as well as they thought they did. The main problem seems to be that people have higher-level understanding of the object, and experience using it correctly, which they confuse with a more in-depth knowledge of the mechanisms that make it work.
That doesn’t necessarily contradict AnnaSalamon’s point about stopping because of learned blankness. Seeing something stop working, and not immediately knowing why it messed up or how to fix it, might be enough to trigger that same lack of confidence that shows up after people try and fail to explain how something works. And in order to fix it you often don’t need so much depth of knowledge. Even if you don’t have enough knowledge to fully explain the mechanism that makes something work, you still might know enough to identify and fix this particular problem, especially since you have the thing right there to look at, think about, and play around with.
Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26, 521–562.
Wow. That article is pure gold: the kinds of mistaken explanations they talk about are exactly what I hear from people who give unhelpful explanations. They don't see the limits of their own understanding of the phenomenon, and obviously can't convey what they lack. Any explanation they give is therefore extremely brittle, since they can't do much more than swap in other terms for the mysterious concepts they invoke.
(This is not to say such explanations are completely unhelpful; a partial explanation is better than none at all. But in that case, it's preferable to clarify that your understanding is limited and that you can't connect it to a broader understanding of the world.)
This is why study groups work (if you use them properly). Explaining something to someone else forces you to think about it much more clearly, and being asked about something you can't answer reveals the holes in your knowledge.
I think that being able to clearly explain something is the mark of truly understanding it.