My Wild and Reckless Youth

It is said that parents do all the things they tell their children not to do, which is how they know not to do them.

Long ago, in the unthinkably distant past, I was a devoted Traditional Rationalist, conceiving myself skilled according to that kind, yet I knew not the Way of Bayes. When the young Eliezer was confronted with a mysterious-seeming question, the precepts of Traditional Rationality did not stop him from devising a Mysterious Answer. It is, by far, the most embarrassing mistake I made in my life, and I still wince to think of it.

What was my mysterious answer to a mysterious question? This I will not describe, for it would be a long tale and complicated. I was young, and a mere Traditional Rationalist who knew not the teachings of Tversky and Kahneman. I knew about Occam’s Razor, but not the conjunction fallacy. I thought I could get away with thinking complicated thoughts myself, in the literary style of the complicated thoughts I read in science books, not realizing that correct complexity is only possible when every step is pinned down overwhelmingly. Today, one of the chief pieces of advice I give to aspiring young rationalists is “Do not attempt long chains of reasoning or complicated plans.”

Nothing more than this need be said: even after I invented my “answer,” the phenomenon was still a mystery unto me, and possessed the same quality of wondrous impenetrability that it had at the start.

Make no mistake, that younger Eliezer was not stupid. All the errors of which the young Eliezer was guilty are still being made today by respected scientists in respected journals. It would have taken a subtler skill to protect him than ever he was taught as a Traditional Rationalist.

Indeed, the young Eliezer diligently and painstakingly followed the injunctions of Traditional Rationality in the course of going astray.

As a Traditional Rationalist, the young Eliezer was careful to ensure that his Mysterious Answer made a bold prediction of future experience. Namely, I expected future neurologists to discover that neurons were exploiting quantum gravity, à la Sir Roger Penrose. This required neurons to maintain a certain degree of quantum coherence, which was something you could look for, and find or not find. Either you observe that or you don’t, right?

But my hypothesis made no retrospective predictions. According to Traditional Science, retrospective predictions don’t count—so why bother making them? To a Bayesian, on the other hand, if a hypothesis does not today have a favorable likelihood ratio over “I don’t know,” it raises the question of why you today believe anything more complicated than “I don’t know.” But I knew not the Way of Bayes, so I was not thinking about likelihood ratios or focusing probability density. I had Made a Falsifiable Prediction; was this not the Law?
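The likelihood-ratio test mentioned here can be made concrete with a small numerical sketch (the numbers below are hypothetical, not from the essay): compare how much probability a hypothesis assigns to the observed data against how much the ignorance hypothesis “I don’t know” assigns to it.

```python
# Hedged illustration with made-up numbers: a hypothesis earns credence
# over "I don't know" only if it assigns the observed data MORE
# probability than the ignorance hypothesis does.

def likelihood_ratio(p_data_given_h: float, p_data_given_ignorance: float) -> float:
    """Ratio > 1 means the data favor the hypothesis over ignorance."""
    return p_data_given_h / p_data_given_ignorance

# Suppose an experiment has four possible outcomes. "I don't know"
# spreads probability evenly: 0.25 each. A hypothesis that genuinely
# concentrates probability mass might assign 0.7 to the outcome observed.
favored = likelihood_ratio(0.7, 0.25)    # roughly 2.8: data favor the hypothesis

# A merely "falsifiable" hypothesis that rules nothing out in practice
# assigns the same 0.25 the ignorance prior does, and gains nothing.
vacuous = likelihood_ratio(0.25, 0.25)   # 1.0: no better than "I don't know"

print(favored, vacuous)
```

The second case is the trap the essay describes: a prediction can be falsifiable in principle while the hypothesis still extracts no evidential advantage over ignorance today.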

As a Traditional Rationalist, the young Eliezer was careful not to believe in magic, mysticism, carbon chauvinism, or anything of that sort. I proudly professed of my Mysterious Answer, “It is just physics like all the rest of physics!” As if you could save magic from being a cognitive isomorph of magic, by calling it quantum gravity. But I knew not the Way of Bayes, and did not see the level on which my idea was isomorphic to magic. I gave my allegiance to physics, but this did not save me; what does probability theory know of allegiances? I avoided everything that Traditional Rationality told me was forbidden, but what was left was still magic.

Beyond a doubt, my allegiance to Traditional Rationality helped me get out of the hole I dug myself into. If I hadn’t been a Traditional Rationalist, I would have been completely screwed. But Traditional Rationality still wasn’t enough to get it right. It just led me into different mistakes than the ones it had explicitly forbidden.

When I think about how my younger self very carefully followed the rules of Traditional Rationality in the course of getting the answer wrong, it sheds light on the question of why people who call themselves “rationalists” do not rule the world. You need one whole hell of a lot of rationality before it does anything but lead you into new and interesting mistakes.

Traditional Rationality is taught as an art, rather than a science; you read the biographies of famous physicists describing the lessons life taught them, and you try to do what they tell you to do. But you haven’t lived their lives, and half of what they’re trying to describe is an instinct that has been trained into them.

The way Traditional Rationality is designed, it would have been acceptable for me to spend thirty years on my silly idea, so long as I succeeded in falsifying it eventually, and was honest with myself about what my theory predicted, and accepted the disproof when it arrived, et cetera. This is enough to let the Ratchet of Science click forward, but it’s a little harsh on the people who waste thirty years of their lives. Traditional Rationality is a walk, not a dance. It’s designed to get you to the truth eventually, and gives you all too much time to smell the flowers along the way.

Traditional Rationalists can agree to disagree. Traditional Rationality doesn’t have the ideal that thinking is an exact art in which there is only one correct probability estimate given the evidence. In Traditional Rationality, you’re allowed to guess, and then test your guess. But experience has taught me that if you don’t know, and you guess, you’ll end up being wrong.

The Way of Bayes is also an imprecise art, at least the way I’m holding forth upon it. These essays are still fumbling attempts to put into words lessons that would be better taught by experience. But at least there’s underlying math, plus experimental evidence from cognitive psychology on how humans actually think. Maybe that will be enough to cross the stratospherically high threshold required for a discipline that lets you actually get it right, instead of just constraining you into interesting new mistakes.