Inferential Distance

Last edit: 9 Feb 2021 8:01 UTC by Yoav Ravid

The inferential distance between Person A and Person B on a given subject is the number of steps of inference A must take to reach B’s conclusion.

In Expecting Short Inferential Distances, Eliezer Yudkowsky posits that humans systematically underestimate inferential distances.

“And if you think you can explain the concept of ‘systematically underestimated inferential distances’ briefly, in just a few words, I’ve got some sad news for you...” – Expecting Short Inferential Distances

Example: Evidence for Evolution

Explaining the evidence for the theory of evolution to a physicist would be easy; even if the physicist didn’t already know about evolution, they would understand the concepts of evidence, Occam’s razor, naturalistic explanations, and the general orderly nature of the universe. Explaining the evidence for the theory of evolution to someone without a science background would be much harder. Before even mentioning the specific evidence for evolution, you would have to explain the concept of evidence, why some kinds of evidence are more valuable than others, what does and doesn’t count as evidence, and so on. This would be unlikely to work during a short conversation.

There is a short inferential distance between you and the physicist; there is a very long inferential distance between you and the person without a science background. Many members of Less Wrong consider expecting short inferential distances a classic error. It is also a difficult problem to solve, since most people take offense if you tell them outright that the inferential distance between you is too great to explain a theory properly. Some have attempted to explain this through evolutionary psychology: in the ancestral environment, there was minimal difference in knowledge between people, and therefore no need to account for inferential distances.

See Also

Expecting Short Inferential Distances
Eliezer Yudkowsky, 22 Oct 2007 23:42 UTC. 216 points, 106 comments, 3 min read.

Double Illusion of Transparency
Eliezer Yudkowsky, 24 Oct 2007 23:06 UTC. 82 points, 32 comments, 3 min read.

Expansive translations: considerations and possibilities
ozziegooen, 18 Sep 2020 15:39 UTC. 43 points, 15 comments, 6 min read.

An Intuitive Explanation of Inferential Distance
RichardJActon, 26 Nov 2017 14:13 UTC. 13 points, 6 comments, 3 min read.

Starting point for calculating inferential distance?
JenniferRM, 3 Dec 2010 20:20 UTC. 22 points, 9 comments, 2 min read.

Complexity: inherent, created, and hidden
Swimmer963, 14 Sep 2011 14:33 UTC. 9 points, 49 comments, 4 min read.

Karate Kid and Realistic Expectations for Disagreement Resolution
Raemon, 4 Dec 2019 23:25 UTC. 80 points, 23 comments, 4 min read.

Bridging Inferential Gaps
atucker, 8 Dec 2010 4:50 UTC. 13 points, 20 comments, 2 min read.

Inferential silence
Kaj_Sotala, 25 Sep 2013 12:45 UTC. 77 points, 58 comments, 1 min read.

Illusion of Transparency: Why No One Understands You
Eliezer Yudkowsky, 20 Oct 2007 23:49 UTC. 115 points, 51 comments, 3 min read.

Great minds might not think alike
UnexpectedValues, 26 Dec 2020 19:51 UTC. 246 points, 43 comments, 11 min read.

Write a Thousand Roads to Rome
Screwtape, 8 Feb 2018 18:09 UTC. 72 points, 13 comments, 4 min read.

The Typical Sex Life Fallacy
ozymandias, 7 Oct 2017 21:48 UTC. 27 points, 293 comments, 5 min read.

Understanding is translation
cousin_it, 28 May 2018 13:56 UTC. 87 points, 23 comments, 1 min read.

Reframing misaligned AGI’s: well-intentioned non-neurotypical assistants
zhukeepa, 1 Apr 2018 1:22 UTC. 46 points, 14 comments, 2 min read.

Expected Pain Parameters
Alicorn, 14 Jul 2018 19:30 UTC. 83 points, 12 comments, 2 min read.

The Power to Teach Concepts Better
Liron, 23 Sep 2019 0:21 UTC. 85 points, 22 comments, 8 min read, 2 nominations, 1 review.

Zetetic explanation
Benquo, 27 Aug 2018 0:12 UTC. 87 points, 138 comments, 6 min read.

In My Culture
Duncan_Sabien, 7 Mar 2019 7:22 UTC. 56 points, 58 comments, 1 min read, 2 nominations, 2 reviews.

LW Women- Minimizing the Inferential Distance
daenerys, 25 Nov 2012 23:33 UTC. 96 points, 1,262 comments, 7 min read.

Being a teacher
Swimmer963, 14 Mar 2011 20:03 UTC. 78 points, 154 comments, 3 min read.

Typical Mind and Politics
Scott Alexander, 12 Jun 2009 12:28 UTC. 55 points, 132 comments, 6 min read.

Dominus’ Razor
badger, 26 May 2011 1:05 UTC. 60 points, 27 comments, 1 min read.

Overcoming the Curse of Knowledge
JesseGalef, 18 Oct 2011 17:39 UTC. 58 points, 56 comments, 3 min read.

Teaching the Unteachable
Eliezer Yudkowsky, 3 Mar 2009 23:14 UTC. 49 points, 18 comments, 6 min read.

False Friends and Tone Policing
palladias, 18 Jun 2014 18:20 UTC. 72 points, 49 comments, 3 min read.

A few analogies to illustrate key rationality points
kilobug, 9 Oct 2011 13:00 UTC. 68 points, 52 comments, 5 min read.

Unteachable Excellence
Eliezer Yudkowsky, 2 Mar 2009 15:33 UTC. 37 points, 40 comments, 2 min read.

Take heed, for it is a trap
Zed, 14 Aug 2011 10:23 UTC. 54 points, 189 comments, 6 min read.

The Sally-Anne fallacy
philh, 11 Apr 2016 13:06 UTC. 51 points, 30 comments, 1 min read.

Inferential credit history
RyanCarey, 24 Jul 2013 14:12 UTC. 56 points, 36 comments, 3 min read.

Mental Metadata
Alicorn, 30 Mar 2011 3:07 UTC. 41 points, 35 comments, 3 min read.

Mapping Another’s Universe
squidious, 17 Nov 2017 2:37 UTC. 11 points, 5 comments, 3 min read.

Why do people ____?
magfrump, 4 May 2012 4:20 UTC. 36 points, 256 comments, 1 min read.

Blind Goaltenders: Unproductive Disagreements
PDV, 28 Sep 2017 16:19 UTC. 18 points, 8 comments, 2 min read.

Generalizing From One Example
Scott Alexander, 28 Apr 2009 22:00 UTC. 350 points, 412 comments, 6 min read.

Taking the Outgroup Seriously
Davis_Kingsley, 16 Feb 2020 13:23 UTC. 21 points, 8 comments, 2 min read.

The Best Textbooks on Every Subject
lukeprog, 16 Jan 2011 8:30 UTC. 423 points, 373 comments, 7 min read.

Mental Crystallography
Alicorn, 27 Feb 2010 1:04 UTC. 27 points, 57 comments, 2 min read.

Debugging the student
adamzerner, 16 Dec 2020 7:07 UTC. 38 points, 7 comments, 4 min read.

[Link] Thick and thin
[deleted], 6 Jun 2012 12:08 UTC. 39 points, 12 comments, 2 min read.