Telling teenagers that you have to carefully explain an idea at a level accessible to your audience, and that perhaps even then the audience won't understand if they lack some important knowledge or experience... hm.
I think this idea is worth telling at the beginning, but of course, in the most accessible way, and briefly. My preferred way is to describe an ancient setting (for less mindkilling, don't even mention evolution; just say "hunters in a jungle") where any knowledge is easily transferred. If someone says "there are gazelles near the river", everyone knows what "gazelle" means and what "river" means. In our society, if you pick up a scientific journal from a field you don't understand, you probably won't understand the articles. And yet it feels like we should understand everything quickly. This is an example of a bias; we call it "expecting short inferential distances". (Now move on to other biases.)
“Inferential distance” is LW jargon. Does the bias have a standard name?
Illusion of transparency?
The illusion of transparency is thinking that the contents of MIND1 and MIND2 must be similar, ignoring that MIND2 does not have information that strongly influences how MIND1 thinks.
Expecting short inferential distances is underestimating the vertical complexity (information that requires knowledge of other information) of a MAP.
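To make "vertical complexity" concrete, here is a toy sketch (the concepts and their prerequisites below are made-up illustrations, not a claim about any real curriculum): model a MAP as a graph where each concept lists the concepts you must already understand first. The inferential distance to a concept is then the length of the longest prerequisite chain beneath it; the hunter's "gazelle" has distance zero, while a modern scientific concept can sit many layers deep.

```python
# Hypothetical toy model of a "map" as a prerequisite graph.
# Each concept maps to the concepts that must be understood first.
prerequisites = {
    "counting": [],
    "arithmetic": ["counting"],
    "algebra": ["arithmetic"],
    "linear algebra": ["algebra"],
    "calculus": ["algebra"],
    "quantum mechanics": ["calculus", "linear algebra"],
}

def inferential_distance(concept: str) -> int:
    """Length of the longest prerequisite chain below `concept`."""
    deps = prerequisites[concept]
    if not deps:
        return 0  # common knowledge, like "gazelle" or "river"
    return 1 + max(inferential_distance(d) for d in deps)

print(inferential_distance("counting"))           # shallow, ancient-style knowledge
print(inferential_distance("quantum mechanics"))  # a concept several layers deep
```

The bias, in these terms, is expecting every concept to be at distance zero or one from what the listener already knows.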
EDIT: I don't know if there is a standard name for this, and it would not surprise me if there isn't. It seems to me that most biases are about how minds work and communicate, while "inferential distances" is about maps that did not exist in the ancient environment.