Truth: It’s Not That Great

Rationality is pretty great. Just not quite as great as everyone here seems to think it is.

-Yvain, “Extreme Rationality: It’s Not That Great”

The folks most vocal about loving “truth” are usually selling something. For preachers, demagogues, and salesmen of all sorts, the wilder their story, the more they go on about how they love truth...

The people who just want to know things because they need to make important decisions, in contrast, usually say little about their love of truth; they are too busy trying to figure stuff out.

-Robin Hanson, “Who Loves Truth Most?”

A couple weeks ago, Brienne made a post on Facebook that included this remark: “I’ve also gained a lot of reverence for the truth, in virtue of the centrality of truth-seeking to the fate of the galaxy.” But then she edited the post to add a footnote to this sentence: “That was the justification my brain originally threw at me, but it doesn’t actually quite feel true. There’s something more directly responsible for the motivation that I haven’t yet identified.”

I saw this, and commented:

<puts rubber Robin Hanson mask on>

What we have here is a case of subcultural in-group signaling masquerading as something else. In this case, proclaiming how vitally important truth-seeking is is a mark of your subculture. In reality, the truth is sometimes really important, but sometimes it isn’t.

</rubber Robin Hanson mask>

In spite of the distancing pseudo-HTML tags, I actually believe this. When I read some of the more extreme proclamations of the value of truth that float around the rationalist community, I suspect people are doing in-group signaling—or perhaps conflating their own idiosyncratic preferences with rationality. As a mild antidote to this, when you hear someone talking about the value of the truth, try seeing if the statement still makes sense if you replace “truth” with “information.”

This standard gives many statements about the value of truth its stamp of approval. After all, information is pretty damn valuable. But statements like “truth seeking is central to the fate of the galaxy” look a bit suspicious. Is information-gathering central to the fate of the galaxy? You could argue that statement is kinda true if you squint at it right, but really it’s too general. Surely it’s not just any information that’s central to shaping the fate of the galaxy, but information about specific subjects, and even then there are tradeoffs to make.

This is an example of why I suspect “effective altruism” may be better branding for a movement than “rationalism.” The “rationalism” branding encourages the meme that truth-seeking is great, so we should do lots and lots of it, because truth is so great. The effective altruism movement, on the other hand, recognizes that while gathering information about the effectiveness of various interventions is important, there are tradeoffs to be made between spending time and money on gathering information vs. just doing whatever currently seems likely to have the greatest direct impact. Recognize that information is valuable, but avoid analysis paralysis.

Or, consider statements like:

  • Some truths don’t matter much.

  • People often have legitimate reasons for not wanting others to have certain truths.

  • The value of truth often has to be weighed against other goals.

Do these statements sound heretical to you? But what about:

  • Information can be perfectly accurate and also worthless.

  • People often have legitimate reasons for not wanting other people to gain access to their private information.

  • A desire for more information often has to be weighed against other goals.

I struggled to write the first set of statements, though I think they’re right on reflection. Why do they sound so much worse than the second set? Because the word “truth” carries powerful emotional connotations that go beyond its literal meaning. This isn’t just true for rationalists—there’s a reason religions have sayings like “God is Truth” or “I am the way, the truth, and the life.” “God is Facts” or “God is Information” don’t work so well.

There’s something about “truth”—how it readily acts as an applause light, a sacred value which must not be traded off against anything else. As I type that, a little voice in me protests “but truth really is sacred”… but once we refuse to admit there’s any limit to how great truth is, hello affective death spiral.

Consider another quote, from Steven Kaas, that I see frequently referenced on LessWrong: “Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever.” Interestingly, the original blog post included a caveat—“we may have to count everyday social interactions as a partial exception”—which I never see quoted. That aside, the quote has always bugged me. I’ve never had my tires slashed, but I imagine it ruins your whole day. On the other hand, having less than maximally accurate beliefs about something could ruin your whole day, but it could very easily not, depending on the topic.

Furthermore, sometimes sharing certain information doesn’t just have little benefit; it can have substantial costs, or at least substantial risks. It would seriously trivialize Nazi Germany’s crimes to compare that regime to the current US government, but I don’t think that means we have to promote maximally accurate beliefs about ourselves to the folks at the NSA. Or, when negotiating over the price of something, are you required to promote maximally accurate beliefs about the highest price you’d be willing to pay, even if the other party isn’t willing to reciprocate and may respond by demanding that price?

Private information is usually considered private precisely because it has limited benefit to most people, but sharing it could significantly harm the person whose private information it is. A sensible ethic around information needs to be able to deal with issues like that. It needs to be able to deal with questions like: is this information that is in the public interest to know? And is there a power imbalance involved? My rule of thumb is: secrets kept by the powerful deserve extra scrutiny, but so conversely do their attempts to gather other people’s private information.

“Corrupted hardware”-type arguments can suggest you should doubt your own justifications for deceiving others. But parallel arguments suggest you should doubt your own justifications for feeling entitled to information others might have legitimate reasons for keeping private. Arguments like “well, truth is supremely valuable,” “it’s extremely important for me to have accurate beliefs,” or “I’m highly rational, so people should trust me” just don’t cut it.

Finally, being rational in the sense of being well-calibrated doesn’t necessarily require making truth-seeking a major priority. Using the evidence you have well doesn’t necessarily mean gathering lots of new evidence. Often, the alternative to knowing the truth is not believing falsehood, but admitting you don’t know and living with the uncertainty.