Crisis of Faith

It ain’t a true crisis of faith unless things could just as easily go either way.

—Thor Shenkel

Many in this world retain beliefs whose flaws a ten-year-old could point out, if that ten-year-old were hearing the beliefs for the first time. These are not subtle errors we’re talking about. They would be child’s play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion. As Premise Checker put it, “Had the idea of god not come along until the scientific age, only an exceptionally weird person would invent such an idea and pretend that it explained anything.”1

And yet skillful scientific specialists, even the major innovators of a field, even in this very day and age, do not apply that skepticism successfully. Nobel laureate Robert Aumann, of Aumann’s Agreement Theorem, is an Orthodox Jew: I feel reasonably confident in venturing that Aumann must, at one point or another, have questioned his faith. And yet he did not doubt successfully. We change our minds less often than we think.

This should scare you down to the marrow of your bones. It means you can be a world-class scientist and conversant with Bayesian mathematics and still fail to reject a belief whose absurdity a fresh-eyed ten-year-old could see. It shows the invincible defensive position which a belief can create for itself, if it has long festered in your mind.

What does it take to defeat an error that has built itself a fortress?

But by the time you know it is an error, it is already defeated. The dilemma is not “How can I reject long-held false belief X?” but “How do I know if long-held belief X is false?” Self-honesty is at its most fragile when we’re not sure which path is the righteous one. And so the question becomes:

How can we create in ourselves a true crisis of faith, that could just as easily go either way?

Religion is the trial case we can all imagine.2 But if you have cut off all sympathy and now think of theists as evil mutants, then you won’t be able to imagine the real internal trials they face. You won’t be able to ask the question:

What general strategy would a religious person have to follow in order to escape their religion?

I’m sure that some, looking at this challenge, are already rattling off a list of standard atheist talking points—“They would have to admit that there wasn’t any Bayesian evidence for God’s existence,” “They would have to see the moral evasions they were carrying out to excuse God’s behavior in the Bible,” “They need to learn how to use Occam’s Razor—”

Wrong! Wrong wrong wrong! This kind of rehearsal, where you just cough up points you already thought of long before, is exactly the style of thinking that keeps people within their current religions. If you stay with your cached thoughts, if your brain fills in the obvious answer so fast that you can’t see originally, you surely will not be able to conduct a crisis of faith.

Maybe it’s just a question of not enough people reading Gödel, Escher, Bach at a sufficiently young age, but I’ve noticed that a large fraction of the population—even technical folk—have trouble following arguments that go this meta.3 On my more pessimistic days I wonder if the camel has two humps.

Even when it’s explicitly pointed out, some people seemingly cannot follow the leap from the object-level “Use Occam’s Razor! You have to see that your God is an unnecessary belief!” to the meta-level “Try to stop your mind from completing the pattern the usual way!” Because in the same way that all your rationalist friends talk about Occam’s Razor like it’s a good thing, and in the same way that Occam’s Razor leaps right up into your mind, so too, the obvious friend-approved religious response is “God’s ways are mysterious and it is presumptuous to suppose that we can understand them.” So for you to think that the general strategy to follow is “Use Occam’s Razor” would be like a theist saying that the general strategy is to have faith.

“But—but Occam’s Razor really is better than faith! That’s not like preferring a different flavor of ice cream! Anyone can see, looking at history, that Occamian reasoning has been far more productive than faith—”

Which is all true. But beside the point. The point is that you, saying this, are rattling off a standard justification that’s already in your mind. The challenge of a crisis of faith is to handle the case where, possibly, our standard conclusions are wrong and our standard justifications are wrong. So if the standard justification for X is “Occam’s Razor!” and you want to hold a crisis of faith around X, you should be questioning whether Occam’s Razor really endorses X, whether your understanding of Occam’s Razor is correct, and—if you want to have sufficiently deep doubts—whether simplicity is the sort of criterion that has worked well historically in this case, or could reasonably be expected to work, et cetera. If you would advise a religionist to question their belief that “faith” is a good justification for X, then you should advise yourself to put forth an equally strong effort to question your belief that “Occam’s Razor” is a good justification for X.4

If “Occam’s Razor!” is your usual reply, your standard reply, the reply that all your friends give—then you’d better block your brain from instantly completing that pattern, if you’re trying to instigate a true crisis of faith.

Better to think of such rules as, “Imagine what a skeptic would say—and then imagine what they would say to your response—and then imagine what else they might say, that would be harder to answer.”

Or, “Try to think the thought that hurts the most.”

And above all, the rule:

Put forth the same level of desperate effort that it would take for a theist to reject their religion.

Because if you aren’t trying that hard, then—for all you know—your head could be stuffed full of nonsense as bad as religion.

Without a convulsive, wrenching effort to be rational, the kind of effort it would take to throw off a religion—then how dare you believe anything, when Robert Aumann believes in God?

Someone (I forget who) once observed that people had only until a certain age to reject their religious faith. Afterward they would have answers to all the objections, and it would be too late. That is the kind of existence you must surpass. This is a test of your strength as a rationalist, and it is very severe; but if you cannot pass it, you will be weaker than a ten-year-old.

But again, by the time you know a belief is an error, it is already defeated. So we’re not talking about a desperate, convulsive effort to undo the effects of a religious upbringing, after you’ve come to the conclusion that your religion is wrong. We’re talking about a desperate effort to figure out if you should be throwing off the chains, or keeping them. Self-honesty is at its most fragile when we don’t know which path we’re supposed to take—that’s when rationalizations are not obviously sins.

Not every doubt calls for staging an all-out Crisis of Faith. But you should consider it when:

  • A belief has long remained in your mind;

  • It is surrounded by a cloud of known arguments and refutations;

  • You have sunk costs in it (time, money, public declarations);

  • The belief has emotional consequences (remember that this does not make it wrong);

  • It has gotten mixed up in your personality generally.

None of these warning signs are immediate disproofs. These attributes place a belief at risk for all sorts of dangers, and make it very hard to reject when it is wrong. And they hold for Richard Dawkins’s belief in evolutionary biology, not just the Pope’s Catholicism.

Nor does this mean that we’re only talking about different flavors of ice cream. Two beliefs can inspire equally deep emotional attachments without having equal evidential support. The point is not to have shallow beliefs, but to have a map that reflects the territory.

I emphasize this, of course, so that you can admit to yourself, “My belief has these warning signs,” without having to say to yourself, “My belief is false.”

But what these warning signs do mark is a belief that will take more than an ordinary effort to doubt effectively. It will take more than an ordinary effort to doubt in such a way that if the belief is in fact false, you will in fact reject it. And where you cannot doubt in this way, you are blind, because your brain will hold the belief unconditionally. When a retina sends the same signal regardless of the photons entering it, we call that eye blind.5

When should you stage a Crisis of Faith?

Again, think of the advice you would give to a theist: If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith. If the belief is as solidly supported as gravity, you needn’t bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity. So try to imagine what the skeptics out there would say to your “solid as gravity” argument. Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, “Here is something I need to put effort into doubting properly.”

If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith. Don’t try to do it haphazardly; don’t try it in an ad-hoc spare moment. Don’t rush to get it done with quickly, so that you can say, “I have doubted, as I was obliged to do.” That wouldn’t work for a theist, and it won’t work for you either. Rest up the previous day, so you’re in good mental condition. Allocate some uninterrupted hours. Find somewhere quiet to sit down. Clear your mind of all standard arguments; try to see from scratch. And make a desperate effort to put forth a true doubt that would destroy a false—and only a false—deeply held belief.

Elements of the Crisis of Faith technique have been scattered over many essays:

  • Avoiding Your Belief’s Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers. You need to seek out the most painful spots, not the arguments that are most reassuring to consider.

  • The Meditation on Curiosity—Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write,” and there is likewise a distinction between wanting to have investigated and wanting to investigate. It is not enough to say, “It is my duty to criticize my own beliefs”; you must be curious, and only uncertainty can create curiosity. Keeping in mind conservation of expected evidence may help you update yourself incrementally: for every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another. Thus you can be truly curious each time about how it will go.

  • Original Seeing—To prevent standard cached thoughts from rushing in and completing the pattern.

  • The Litany of Gendlin and the Litany of Tarski—People can stand what is true, for they are already enduring it. If a belief is true, you will be better off believing it, and if it is false, you will be better off rejecting it. You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to, without excuses, come to the full understanding that if there is no God then they will be better off believing there is no God. If one cannot come to accept this on a deep emotional level, one will not be able to have a crisis of faith. So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it. Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist’s view of the universe.

  • Tsuyoku Naritai!—The drive to become stronger.

  • The Genetic Heuristic—You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right.

  • The Importance of Saying “Oops”—It really is less painful to swallow the entire bitter pill in one terrible gulp.

  • Singlethink—The opposite of doublethink. See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them. If you become aware of what you are not thinking, you can think it.

  • Affective Death Spirals and Resist the Happy Death Spiral—Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose. But since affective death spirals can also get started around real things that are genuinely nice, you don’t have to admit that your belief is a lie, to try and resist the halo effect at every point—refuse false praise even of genuinely nice things. Policy debates should not appear one-sided.

  • Hold Off On Proposing Solutions—Don’t propose any solutions until the problem has been discussed as thoroughly as possible. Make your mind hold off on knowing what its answer will be; and try for five minutes before giving up—both generally, and especially when pursuing the devil’s point of view.
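The conservation-of-expected-evidence principle invoked above, under “The Meditation on Curiosity,” can be stated precisely. As a sketch in standard probability notation (H is the belief under doubt, E a possible observation; the notation is mine, not the essay’s):

```latex
% Law of total probability: the prior is a weighted average of the
% two possible posteriors.
P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E)
% Equivalently, the expected posterior equals the prior, so the
% expected net shift in belief from seeing the evidence is zero:
\mathbb{E}\left[\,P(H \mid E)\,\right] = P(H)
```

If you could expect the investigation to raise your confidence on average, you should simply update now; genuine curiosity is possible only when the result could honestly go either way.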

Many of the standard techniques discussed in How to Actually Change Your Mind and Map and Territory are also particularly relevant.

But really, there’s rather a lot of relevant material, here and on Overcoming Bias. There are ideas I have yet to properly introduce. There is the concept of isshokenmei—the desperate, extraordinary, convulsive effort to be rational. The effort that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never broke free of their faiths.

The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes. It’s when you get a chance to use the skills that you’ve been practicing for so long, all-out against yourself.

I wish you the best of luck against your opponent. Have a wonderful crisis!

1See “Occam’s Razor” (in Map and Territory).

2Readers born to atheist parents have missed out on a fundamental life trial, and must make do with the poor substitute of thinking of their religious friends.

3See “Archimedes’s Chronophone” (http://lesswrong.com/lw/h5/archimedess_chronophone) and “Chronophone Motivations” (http://lesswrong.com/lw/h6/chronophone_motivations).

4Think of all the people out there who don’t understand the Minimum Description Length or Solomonoff induction formulations of Occam’s Razor, who think that Occam’s Razor outlaws many-worlds or the simulation hypothesis. They would need to question their formulations of Occam’s Razor and their notions of why simplicity is a good thing. Whatever X in contention you just justified by saying “Occam’s Razor!” is, I bet, not the same level of Occamian slam dunk as gravity.
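The two formulations of Occam’s Razor named in this footnote can be sketched as follows (K denotes Kolmogorov complexity and L(·) a description length in bits; these are the standard textbook statements, not notation from this essay):

```latex
% Solomonoff-style prior: a hypothesis's probability falls off
% exponentially with the length of the shortest program computing it.
P(H) \propto 2^{-K(H)}
% Minimum Description Length: prefer the hypothesis minimizing the
% total cost of encoding the hypothesis plus the data given it.
H^{*} = \arg\min_{H} \left[ L(H) + L(D \mid H) \right]
```

On these formulations, simplicity is measured by the length of the generating program, not by how many entities or worlds the hypothesis contains—which is why they do not automatically penalize many-worlds.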

5Compare “What Is Evidence?” in Map and Territory.