Expecting Short Inferential Distances

Homo sapiens’s environment of evolutionary adaptedness (a.k.a. EEA or “ancestral environment”) consisted of hunter-gatherer bands of at most 200 people, with no writing. All inherited knowledge was passed down by speech and memory.

In a world like that, all background knowledge is universal knowledge. All information not strictly private is public, period.

In the ancestral environment, you were unlikely to end up more than one inferential step away from anyone else. When you discover a new oasis, you don’t have to explain to your fellow tribe members what an oasis is, or why it’s a good idea to drink water, or how to walk. Only you know where the oasis lies; this is private knowledge. But everyone has the background to understand your description of the oasis, the concepts needed to think about water; this is universal knowledge. When you explain things in an ancestral environment, you almost never have to explain your concepts. At most you have to explain one new concept, not two or more simultaneously.

In the ancestral environment there were no abstract disciplines with vast bodies of carefully gathered evidence generalized into elegant theories transmitted by written books whose conclusions are a hundred inferential steps removed from universally shared background premises.

In the ancestral environment, anyone who says something with no obvious support is a liar or an idiot. You’re not likely to think, “Hey, maybe this person has well-supported background knowledge that no one in my band has even heard of,” because it was a reliable invariant of the ancestral environment that this didn’t happen.

Conversely, if you say something blatantly obvious and the other person doesn’t see it, they’re the idiot, or they’re being deliberately obstinate to annoy you.

And to top it off, if someone says something with no obvious support and expects you to believe it—acting all indignant when you don’t—then they must be crazy.

Combined with the illusion of transparency and self-anchoring (the tendency to model other minds as though they were slightly modified versions of oneself), I think this explains a lot about the legendary difficulty most scientists have in communicating with a lay audience—or even communicating with scientists from other disciplines. When I observe failures of explanation, I usually see the explainer taking one step back, when they need to take two or more steps back. Or listeners assume that things should be visible in one step, when they take two or more steps to explain. Both sides act as if they expect very short inferential distances from universal knowledge to any new knowledge.

A biologist, speaking to a physicist, can justify evolution by saying it is the simplest explanation. But not everyone on Earth has been inculcated with that legendary history of science, from Newton to Einstein, which invests the phrase “simplest explanation” with its awesome import: a Word of Power, spoken at the birth of theories and carved on their tombstones. To someone else, “But it’s the simplest explanation!” may sound like an interesting but hardly knockdown argument; it doesn’t feel like all that powerful a tool for comprehending office politics or fixing a broken car. Obviously the biologist is infatuated with their own ideas, too arrogant to be open to alternative explanations which sound just as plausible. (If it sounds plausible to me, it should sound plausible to any sane member of my band.)

And from the biologist’s perspective, they can understand how evolution might sound a little odd at first—but when someone rejects evolution even after the biologist explains that it’s the simplest explanation, well, it’s clear that nonscientists are just idiots and there’s no point in talking to them.

A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don’t recurse far enough, you’re just talking to yourself.
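To make the “recurse far enough” image concrete, here is a small toy sketch in Python, my own illustration rather than anything from the text: it treats an explanation as a graph of concepts, each resting on prerequisite concepts, and counts how many inferential steps separate a claim from what a given audience already accepts. The function, the prerequisites graph, and the two audiences are hypothetical names invented for the example.

    # Toy model (hypothetical, for illustration only): an explanation as a
    # dependency graph of concepts, where inferential distance is the depth of
    # recursion needed to reach the audience's existing background knowledge.
    def inferential_distance(claim, prerequisites, audience_knows):
        """Steps from the audience's background knowledge to `claim`."""
        if claim in audience_knows:
            return 0
        steps = [inferential_distance(p, prerequisites, audience_knows)
                 for p in prerequisites.get(claim, [])]
        # A claim whose support never bottoms out in shared knowledge is,
        # to this audience, indistinguishable from an unsupported claim.
        return 1 + max(steps) if steps else float("inf")

    # Hypothetical example: the same claim sits two steps away for an audience
    # that already reveres Occam's Razor, and unreachably far for one that doesn't.
    prerequisites = {
        "evolution is the best explanation": ["prefer the simplest explanation"],
        "prefer the simplest explanation": ["Occam's Razor"],
    }
    physicist = {"Occam's Razor"}
    layperson = set()
    print(inferential_distance("evolution is the best explanation", prerequisites, physicist))  # 2
    print(inferential_distance("evolution is the best explanation", prerequisites, layperson))  # inf

The only point of the toy is that the recursion has to bottom out in the audience’s actual background knowledge, not in yours; otherwise the distance is effectively infinite.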

If at any point you make a statement without obvious justification in arguments you’ve previously supported, the audience just thinks you’re crazy.

This also happens when you allow yourself to be seen visibly attaching greater weight to an argument than is justified in the eyes of the audience at that time. For example, talking as if you think “simpler explanation” is a knockdown argument for evolution (which it is), rather than a sorta-interesting idea (which it sounds like to someone who hasn’t been raised to revere Occam’s Razor).

Oh, and you’d better not drop any hints that you think you’re working a dozen inferential steps away from what the audience knows, or that you think you have special background knowledge not available to them. The audience doesn’t know anything about an evolutionary-psychological argument for a cognitive bias to underestimate inferential distances leading to traffic jams in communication. They’ll just think you’re condescending.

And if you think you can explain the concept of “systematically underestimated inferential distances” briefly, in just a few words, I’ve got some sad news for you . . .