Maybe ‘short inferential distances’ would be a good place for you to start?
Although three commenters have already suggested talking about inferential distances, I am afraid it is quite hard to make such a presentation both interesting and believable. Telling teenagers that you have to carefully explain an idea at a level accessible to your audience, and that perhaps even then the audience will not understand if they lack some important knowledge or experience... hm. The audience (the real one at the presentation, not the one spoken about) would interpret it either as the banal “it is useless to try to explain things to idiots”, or as simple bragging along the lines of “I am so smart that you have to study a lot to understand me”. When I was 16, if somebody had told me that creationists might not accept my argument not because of their fanaticism and stupidity but because of some “inferential distance”, I would have thought he secretly sided with the creationists. There is little appreciation for subtlety at that age.
If I had to suggest a topic, take something simple, relatively non-controversial, and easy to explain. Pick one or more biases or fallacies and present them together with realistic illustrative examples. The base rate fallacy may work fine. Take some quasi-realistic example, such as cancer testing or a court trial, which your audience would consider important. Make them guess the answer; that will make it interactive and therefore more interesting. After they get it wrong (they reliably will), show the right answer, which makes for a surprising point. You have all the ingredients for a good talk.
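The cancer-testing version of this can be sketched in a few lines of Bayes’ theorem. The numbers below (1% prevalence, 80% sensitivity, 9.6% false-positive rate) are the illustrative figures commonly used when presenting this puzzle, not data from any particular study:

```python
# Base rate fallacy illustration: Bayes' theorem for a screening test.
# All numbers are illustrative, chosen only to make the point vivid.

prevalence = 0.01       # P(cancer): 1% of the screened population
sensitivity = 0.80      # P(positive | cancer)
false_positive = 0.096  # P(positive | no cancer)

# Total probability of a positive test result.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Probability of actually having cancer, given a positive result.
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(f"P(cancer | positive test) = {p_cancer_given_positive:.3f}")
```

Audiences typically guess somewhere around 70–80%, because they focus on the test’s accuracy and ignore the low base rate; the computed answer comes out below 8%, which is exactly the surprise that makes the talk land.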
I think this idea is worth mentioning at the beginning, but of course as briefly and accessibly as possible. My preferred way is to describe an ancient setting (for less mindkilling, don’t even mention evolution; just say “hunters in a jungle”) where any knowledge is easily transferred. If someone says “there are gazelles near the river”, everyone knows what “gazelle” means and what “river” means. In our society, if you pick up a scientific journal from a field you don’t understand, you probably won’t understand the articles. And yet it feels like we should understand everything quickly. This is an example of a bias; we call it “expecting short inferential distances”. (Now move on to other biases.)
“Inferential distance” is LW jargon. Does the bias have a standard name?
Illusion of transparency?
The illusion of transparency is thinking that the contents of MIND1 and MIND2 must be similar, ignoring that MIND2 does not have information that strongly influences how MIND1 thinks.
Expecting short inferential distances is underestimating the vertical complexity (information that requires knowledge of other information) of a MAP.
EDIT: I don’t know whether there is a standard name for this, and it would not surprise me if there isn’t. It seems to me that most biases are about how minds work and communicate, while “inferential distances” are about maps that did not exist in the ancient environment.
When I did my presentation before, a substantial fraction of the audience actually got the quasi-trick question about the breast cancer test probability correct.
More than half?
I don’t remember, but less than half. Maybe a third.
Excellent suggestion! That’s what I would recommend, too, mainly because this is the most common mistake presenters at all levels make. Concentrating on it explicitly is likely to make a presentation much more accessible and useful.