Inferential distance between two people, with respect to an item of knowledge, is the number of steps or concepts they need to share before they can successfully communicate the object-level point. It can be thought of as the missing foundation: the building-block concepts required to think clearly about a specific thing.
In Expecting Short Inferential Distances, Eliezer Yudkowsky posits that humans systematically underestimate inferential distances.
And if you think you can explain the concept of “systematically underestimated inferential distances” briefly, in just a few words, I’ve got some sad news for you . . . – Expecting Short Inferential Distances
Example: Evidence for Evolution
Explaining the evidence for the theory of evolution to a physicist would be easy; even if the physicist didn’t already know about evolution, they would understand the concepts of evidence, Occam’s razor, naturalistic explanations, and the general orderly nature of the universe. Explaining the evidence for the theory of evolution to someone without a science background would be much harder. Before even mentioning the specific evidence for evolution, you would first have to explain the concept of evidence itself: why some kinds of evidence are more valuable than others, what does and doesn’t count as evidence, and so on. This would be unlikely to succeed in a short conversation.
There is a short inferential distance between you and the physicist; there is a very long inferential distance between you and the person without any science background. Many members of Less Wrong believe that expecting short inferential distances is a classic error. It is also a difficult problem to work around, since most people will feel offended if you explicitly tell them that the inferential distance between you is too great to explain a theory properly. Some people have attempted to explain this through evolutionary psychology: in the ancestral environment, there was minimal difference in knowledge between people, and therefore no need to account for inferential distances.
Why It’s Hard to Explain Things: Inferential Distance by Peter Hurford
How all human communication fails, except by accident, or a commentary of Wiio’s laws
From the old discussion page:
The current example seems to set up the LWer as the physicist and the newcomer as the fool who doesn’t know anything about science. Could we potentially switch out the evolution example for something everyone will probably be on board with? For instance: imagine a chef trying to explain the particularities of making a very fancy dish to someone who doesn’t know how to cook. It would be nice to have at least one such friendly example, since this entry is linked to a lot by LWers when talking with others.