The Proper Use of Humility

It is widely recognized that good science requires some kind of humility. What sort of humility is more controversial.

Consider the creationist who says: “But who can really know whether evolution is correct? It is just a theory. You should be more humble and open-minded.” Is this humility? The creationist practices a very selective underconfidence, refusing to integrate massive weights of evidence in favor of a conclusion they find uncomfortable. I would say that whether you call this “humility” or not, it is the wrong step in the dance.

What about the engineer who humbly designs fail-safe mechanisms into machinery, even though they’re damn sure the machinery won’t fail? This seems like a good kind of humility to me. Historically, it’s not unheard-of for an engineer to be damn sure a new machine won’t fail, and then it fails anyway.

What about the student who humbly double-checks the answers on their math test? Again I’d categorize that as good humility. The student who double-checks their answers wants to become stronger; they react to a possible inner flaw by doing what they can to repair the flaw.

What about a student who says, “Well, no matter how many times I check, I can’t ever be certain my test answers are correct,” and therefore doesn’t check even once? Even if this choice stems from an emotion similar to the emotion felt by the previous student, it is less wise.

You suggest studying harder, and the student replies: “No, it wouldn’t work for me; I’m not one of the smart kids like you; nay, one so lowly as myself can hope for no better lot.” This is social modesty, not humility. It has to do with regulating status in the tribe, rather than scientific process. If you ask someone to “be more humble,” by default they’ll associate the words with social modesty—which is an intuitive, everyday, ancestrally relevant concept. Scientific humility is a more recent and rarefied invention, and it is not inherently social. Scientific humility is something you would practice even if you were alone in a spacesuit, light years from Earth with no one watching. Or even if you received an absolute guarantee that no one would ever criticize you again, no matter what you said or thought of yourself. You’d still double-check your calculations if you were wise.

The student says: “But I’ve seen other students double-check their answers and then they still turned out to be wrong. Or what if, by the problem of induction, 2 + 2 = 5 this time around? No matter what I do, I won’t be sure of myself.” It sounds very profound, and very modest. But it is not coincidence that the student wants to hand in the test quickly, and go home and play video games.

The end of an era in physics does not always announce itself with thunder and trumpets; more often it begins with what seems like a small, small flaw . . . But because physicists have this arrogant idea that their models should work all the time, not just most of the time, they follow up on small flaws. Usually, the small flaw goes away under closer inspection. Rarely, the flaw widens to the point where it blows up the whole theory. Therefore it is written: “If you do not seek perfection you will halt before taking your first steps.”

But think of the social audacity of trying to be right all the time! I seriously suspect that if Science claimed that evolutionary theory is true most of the time but not all of the time—or if Science conceded that maybe on some days the Earth is flat, but who really knows—then scientists would have better social reputations. Science would be viewed as less confrontational, because we wouldn’t have to argue with people who say the Earth is flat—there would be room for compromise. When you argue a lot, people look upon you as confrontational. If you repeatedly refuse to compromise, it’s even worse. Consider it as a question of tribal status: scientists have certainly earned some extra status in exchange for such socially useful tools as medicine and cellphones. But this social status does not justify their insistence that only scientific ideas on evolution be taught in public schools. Priests also have high social status, after all. Scientists are getting above themselves—they won a little status, and now they think they’re chiefs of the whole tribe! They ought to be more humble, and compromise a little.

Many people seem to possess rather hazy views of “rationalist humility.” It is dangerous to have a prescriptive principle which you only vaguely comprehend; your mental picture may have so many degrees of freedom that it can adapt to justify almost any deed. Where people have vague mental models that can be used to argue anything, they usually end up believing whatever they started out wanting to believe. This is so convenient that people are often reluctant to give up vagueness. But the purpose of our ethics is to move us, not be moved by us.

“Humility” is a virtue that is often misunderstood. This doesn’t mean we should discard the concept of humility, but we should be careful using it. It may help to look at the actions recommended by a “humble” line of thinking, and ask: “Does acting this way make you stronger, or weaker?” If you think about the problem of induction as applied to a bridge that needs to stay up, it may sound reasonable to conclude that nothing is certain no matter what precautions are employed; but if you consider the real-world difference between adding a few extra cables, and shrugging, it seems clear enough what makes the stronger bridge.

The vast majority of appeals that I witness to “rationalist’s humility” are excuses to shrug. The one who buys a lottery ticket, saying, “But you can’t know that I’ll lose.” The one who disbelieves in evolution, saying, “But you can’t prove to me that it’s true.” The one who refuses to confront a difficult-looking problem, saying, “It’s probably too hard to solve.” The problem is motivated skepticism a.k.a. disconfirmation bias—more heavily scrutinizing assertions that we don’t want to believe.1 Humility, in its most commonly misunderstood form, is a fully general excuse not to believe something; since, after all, you can’t be sure. Beware of fully general excuses!

A further problem is that humility is all too easy to profess. Dennett, in Breaking the Spell: Religion as a Natural Phenomenon, points out that while many religious assertions are very hard to believe, it is easy for people to believe that they ought to believe them. Dennett terms this “belief in belief.” What would it mean to really assume, to really believe, that three is equal to one? It’s a lot easier to believe that you should, somehow, believe that three equals one, and to make this response at the appropriate points in church. Dennett suggests that much “religious belief” should be studied as “religious profession”—what people think they should believe and what they know they ought to say.

It is all too easy to meet every counterargument by saying, “Well, of course I could be wrong.” Then, having dutifully genuflected in the direction of Modesty, having made the required obeisance, you can go on about your way without changing a thing.

The temptation is always to claim the most points with the least effort. The temptation is to carefully integrate all incoming news in a way that lets us change our beliefs, and above all our actions, as little as possible. John Kenneth Galbraith said: “Faced with the choice of changing one’s mind and proving that there is no need to do so, almost everyone gets busy on the proof.”2 And the greater the inconvenience of changing one’s mind, the more effort people will expend on the proof.

But y’know, if you’re gonna do the same thing anyway, there’s no point in going to such incredible lengths to rationalize it. Often I have witnessed people encountering new information, apparently accepting it, and then carefully explaining why they are going to do exactly the same thing they planned to do previously, but with a different justification. The point of thinking is to shape our plans; if you’re going to keep the same plans anyway, why bother going to all that work to justify it? When you encounter new information, the hard part is to update, to react, rather than just letting the information disappear down a black hole. And humility, properly misunderstood, makes a wonderful black hole—all you have to do is admit you could be wrong. Therefore it is written: “To be humble is to take specific actions in anticipation of your own errors. To confess your fallibility and then do nothing about it is not humble; it is boasting of your modesty.”

1. Charles S. Taber and Milton Lodge, “Motivated Skepticism in the Evaluation of Political Beliefs,” American Journal of Political Science 50, no. 3 (2006): 755–769.

2. John Kenneth Galbraith, Economics, Peace and Laughter (Plume, 1981), 50.