No Safe Defense, Not Even Science

I don’t ask my friends about their childhoods—I lack social curiosity—and so I don’t know how much of a trend this really is:

Of the people I know who are reaching upward as rationalists, who volunteer information about their childhoods, there is a surprising tendency to hear things like: “My family joined a cult and I had to break out,” or “One of my parents was clinically insane and I had to learn to filter out reality from their madness.”

My own experience with growing up in an Orthodox Jewish family seems tame by comparison… but it accomplished the same outcome: It broke my core emotional trust in the sanity of the people around me.

Until this core emotional trust is broken, you don’t start growing as a rationalist. I have trouble putting into words why this is so. Maybe any unusual skills you acquire—anything that makes you unusually rational—require you to zig when other people zag. Maybe that’s just too scary, if the world still seems like a sane place to you.

Or maybe you don’t bother putting in the hard work to be extra bonus sane, if normality doesn’t scare the hell out of you.

I know that many aspiring rationalists seem to run into roadblocks around things like cryonics or many-worlds. Not that they don’t see the logic; they see the logic and wonder, “Can this really be true, when it seems so obvious now, and yet none of the people around me believe it?”

Yes. Welcome to the Earth where ethanol is made from corn and environmentalists oppose nuclear power. I’m sorry.

(See also: Cultish Countercultishness. If you end up in the frame of mind of nervously seeking reassurance, this is never a good thing—even if it’s because you’re about to believe something that sounds logical but could cause other people to look at you funny.)

People who’ve had their trust broken in the sanity of the people around them seem to be able to evaluate strange ideas on their merits, without feeling nervous about their strangeness. The glue that binds them to their current place has dissolved, and they can walk in some direction, hopefully forward.

Lonely dissent, I called it. True dissent doesn’t feel like going to school wearing black; it feels like going to school wearing a clown suit.

That’s what it takes to be the lone voice who says, “If you really think you know who’s going to win the election, why aren’t you picking up the free money on the Intrade prediction market?” while all the people around you are thinking, “It is good to be an individual and form your own opinions, the shoe commercials told me so.”

Maybe in some other world, some alternate Everett branch with a saner human population, things would be different… but in this world, I’ve never seen anyone begin to grow as a rationalist until they make a deep emotional break with the wisdom of their pack.

Maybe in another world, things would be different. And maybe not. I’m not sure that human beings realistically can trust and think at the same time.

Once upon a time, there was something I trusted.

Eliezer18 trusted Science.

Eliezer18 dutifully acknowledged that the social process of science was flawed. Eliezer18 dutifully acknowledged that academia was slow, and misallocated resources, and played favorites, and mistreated its precious heretics.

That’s the convenient thing about acknowledging flaws in people who failed to live up to your ideal; you don’t have to question the ideal itself.

But who could possibly be foolish enough to question, “The experimental method shall decide which hypothesis wins”?

Part of what fooled Eliezer18 was a general problem he had: an aversion to ideas that resembled things idiots had said. Eliezer18 had seen plenty of people questioning the ideals of Science Itself, and without exception they were all on the Dark Side. People who questioned the ideal of Science were invariably trying to sell you snake oil, or trying to safeguard their favorite form of stupidity from criticism, or trying to disguise their personal resignation as a Deeply Wise acceptance of futility.

If there’d been any other ideal that was a few centuries old, the young Eliezer would have looked at it and said, “I wonder if this is really right, and whether there’s a way to do better.” But not the ideal of Science. Science was the master idea, the idea that let you change ideas. You could question it, but you were meant to question it and then accept it, not actually say, “Wait! This is wrong!”

Thus, when once upon a time I came up with a stupid idea, I thought I was behaving virtuously if I made sure there was a Novel Prediction, and professed that I wished to test my idea experimentally. I thought I had done everything I was obliged to do.

So I thought I was safe—not safe from any particular external threat, but safe on some deeper level, like a child who trusts their parent and has obeyed all the parent’s rules.

I’d long since been broken of trust in the sanity of my family or my teachers at school. And the other children weren’t intelligent enough to compete with the conversations I could have with books. But I trusted the books, you see. I trusted that if I did what Richard Feynman told me to do, I would be safe. I never thought those words aloud, but it was how I felt.

When Eliezer23 realized exactly how stupid the stupid theory had been—and that Traditional Rationality had not saved him from it—and that Science would have been perfectly okay with his wasting ten years testing the stupid idea, so long as afterward he admitted it was wrong...

...well, I’m not going to say it was a huge emotional convulsion. I don’t really go in for that kind of drama. It simply became obvious that I’d been stupid.

That’s the trust I’m trying to break in you. You are not safe. Ever.

Not even Science can save you. The ideals of Science were born centuries ago, in a time when no one knew anything about probability theory or cognitive biases. Science demands too little of you; it blesses your good intentions too easily; it is not strict enough; it makes only those injunctions that an average scientist can follow; it accepts slowness as a fact of life.

So don’t think that merely following the rules of Science makes your reasoning defensible.

There is no known procedure you can follow that makes your reasoning defensible.

There is no known set of injunctions which you can satisfy, and know that you will not have been a fool.

There is no known morality-of-reasoning that you can do your best to obey, and know that you are thereby shielded from criticism.

No, not even if you turn to Bayescraft. It’s much harder to use and you’ll never be sure that you’re doing it right.

The discipline of Bayescraft is younger by far than the discipline of Science. You will find no textbooks, no elderly mentors, no histories written of success and failure, no hard-and-fast rules laid down. You will have to study cognitive biases, and probability theory, and evolutionary psychology, and social psychology, and other cognitive sciences, and Artificial Intelligence—and think through for yourself how to apply all this knowledge to the case of correcting yourself, since that isn’t yet in the textbooks.

You don’t know what your own mind is really doing. Researchers find a new cognitive bias every week, and you’re never sure whether you’ve corrected for it, or overcorrected.

The formal math is impossible to apply. It doesn’t break down as easily as John Q. Unbeliever thinks, but you’re never really sure where the foundations come from. You don’t know why the universe is simple enough to understand, or why any prior works for it. You don’t know what your own priors are, let alone if they’re any good.

One of the problems with Science is that it’s too vague to really scare you. “Ideas should be tested by experiment.” How can you go wrong with that?

On the other hand, if you have some math of probability theory laid out in front of you, and worse, you know you can’t actually use it, then it becomes clear that you are trying to do something difficult, and that you might well be doing it wrong.
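To make that concrete, here is the simplest piece of the math in question: Bayes’ theorem, written for a hypothesis $H$ and evidence $E$ (the symbols here are illustrative, nothing more):

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

Stating it takes one line. Using it is another matter: for any real hypothesis you cannot actually write down $P(E \mid \neg H)$, the probability of seeing your evidence if your theory is false, and you do not know your own prior $P(H)$. The equation sits there, exact and unusable, which is precisely what makes the difficulty visible.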

So you cannot trust.

And all this that I have said will not be sufficient to break your trust. That won’t happen until you get into your first real disaster from following The Rules, not from breaking them.

Eliezer18 already had the notion that you were allowed to question Science. Why, of course the scientific method was not itself immune to questioning! For are we not all good rationalists? Are we not allowed to question everything?

It was the notion that you could actually, in real life, follow Science and fail miserably, that Eliezer18 didn’t really, emotionally, believe was possible.

Oh, of course he said it was possible. Eliezer18 dutifully acknowledged the possibility of error, saying, “I could be wrong, but...”

But he didn’t think failure could happen in, you know, real life. You were supposed to look for flaws, not actually find them.

And this emotional difference is a terribly difficult thing to convey in words, and I fear there’s no way I can really warn you.

Your trust will not break, until you apply all that you have learned here and from other books, and take it as far as you can go, and find that this too fails you—that you have still been a fool, and no one warned you against it—that all the most important parts were left out of the guidance you received—that some of the most precious ideals you followed steered you in the wrong direction—

—and if you still have something to protect, so that you must keep going, and cannot resign and wisely acknowledge the limitations of rationality—

then you will be ready to start your journey as a rationalist. To take sole responsibility, to live without any trustworthy defenses, and to forge a higher Art than the one you were once taught.

No one begins to truly search for the Way until their parents have failed them, their gods are dead, and their tools have shattered in their hand.


Post Scriptum: On reviewing a draft of this essay, I discovered a fairly inexcusable flaw in reasoning, which actually affects one of the conclusions drawn. I am leaving it in. Just in case you thought that taking my advice made you safe; or that you were supposed to look for flaws, but not find any.

And of course, if you look too hard for a flaw, and find a flaw that is not a real flaw, and cling to it to reassure yourself of how critical you are, you will only be worse off than before...

It is living with uncertainty—knowing on a gut level that there are flaws, that they are serious, and that you have not found them—that is the difficult thing.