Your Strength as a Rationalist

The following happened to me in an IRC chatroom, long enough ago that I was still hanging around in IRC chatrooms. Time has fuzzed the memory and my report may be imprecise.

So there I was, in an IRC chatroom, when someone reports that a friend of his needs medical advice. His friend says that he’s been having sudden chest pains, so he called an ambulance, and the ambulance showed up, but the paramedics told him it was nothing, and left, and now the chest pains are getting worse. What should his friend do?

I was confused by this story. I remembered reading about homeless people in New York who would call ambulances just to be taken someplace warm, and how the paramedics always had to take them to the emergency room, even on the 27th iteration. Because if they didn’t, the ambulance company could be sued for lots and lots of money. Likewise, emergency rooms are legally obligated to treat anyone, regardless of ability to pay.1 So I didn’t quite understand how the described events could have happened. Anyone reporting sudden chest pains should have been hauled off by an ambulance instantly.

And this is where I fell down as a rationalist. I remembered several occasions where my doctor would completely fail to panic at the report of symptoms that seemed, to me, very alarming. And the Medical Establishment was always right. Every single time. I had chest pains myself, at one point, and the doctor patiently explained to me that I was describing chest muscle pain, not a heart attack. So I said into the IRC channel, “Well, if the paramedics told your friend it was nothing, it must really be nothing—they’d have hauled him off if there was the tiniest chance of serious trouble.”

Thus I managed to explain the story within my existing model, though the fit still felt a little forced . . .

Later on, the fellow comes back into the IRC chatroom and says his friend made the whole thing up. Evidently this was not one of his more reliable friends.

I should have realized, perhaps, that an unknown acquaintance of an acquaintance in an IRC channel might be less reliable than a published journal article. Alas, belief is easier than disbelief; we believe instinctively, but disbelief requires a conscious effort.2

So instead, by dint of mighty straining, I forced my model of reality to explain an anomaly that never actually happened. And I knew how embarrassing this was. I knew that the usefulness of a model is not what it can explain, but what it can’t. A hypothesis that forbids nothing, permits everything, and thereby fails to constrain anticipation.

Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
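
One way to cash out that last sentence in Bayesian terms (a formalization not in the original anecdote, just the standard calculation behind it): if your hypothesis assigns the same likelihood to every possible observation as its negation does, then no observation can move you away from your prior.

\[
P(H \mid D) \;=\; \frac{P(D \mid H)\,P(H)}{P(D \mid H)\,P(H) + P(D \mid \neg H)\,P(\neg H)}
\]

When the model “explains everything equally well,” \(P(D \mid H) = P(D \mid \neg H)\) for every outcome \(D\), the likelihood ratio is 1, and the right-hand side collapses to \(P(H)\). The posterior equals the prior; the data tell you nothing about the hypothesis.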

We are all weak, from time to time; the sad part is that I could have been stronger. I had all the information I needed to arrive at the correct answer, I even noticed the problem, and then I ignored it. My feeling of confusion was a Clue, and I threw my Clue away.

I should have paid more attention to that sensation of “still feels a little forced.” It’s one of the most important feelings a truthseeker can have, a part of your strength as a rationalist. It is a design flaw in human cognition that this sensation manifests as a quiet strain in the back of your mind, instead of a wailing alarm siren and a glowing neon sign reading:

Either Your Model Is False Or This Story Is Wrong.

1 And the hospital absorbs the costs, which are enormous, so hospitals are closing their emergency rooms . . . It makes you wonder what’s the point of having economists if we’re just going to ignore them.

2 From McCluskey (2007), “Truth Bias”: “[P]eople are more likely to correctly judge that a truthful statement is true than that a lie is false. This appears to be a fairly robust result that is not just a function of truth being the correct guess where the evidence is weak—it shows up in controlled experiments where subjects have good reason not to assume truth[.]” http://www.overcomingbias.com/2007/08/truth-bias.html.

And from Gilbert et al. (1993), “You Can’t Not Believe Everything You Read”: “Can people comprehend assertions without believing them? [...] Three experiments support the hypothesis that comprehension includes an initial belief in the information comprehended.”