Extreme Rationality: It’s Not That Great

Related to: Individual Rationality is a Matter of Life and Death, The Benefits of Rationality, Rationality is Systematized Winning
But I finally snapped after reading: Mandatory Secret Identities

Okay, the title was for shock value. Rationality is pretty great. Just not quite as great as everyone here seems to think it is.

For this post, I will be using “extreme rationality” or “x-rationality” in the sense of “techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training.” It seems pretty uncontroversial that there are massive benefits from going from a completely irrational moron to the average intelligent person’s level. I’m coining this new term so there’s no temptation to confuse x-rationality with normal, lower-level rationality.

And for this post, I use “benefits” or “practical benefits” to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.

So, what are these “benefits” of “x-rationality”?

A while back, Vladimir Nesov asked exactly that, and made a thread for people to list all of the positive effects x-rationality had on their lives. Only a handful responded, and most responses weren’t very practical. Anna Salamon, one of the few people to give a really impressive list of benefits, wrote:

I’m surprised there are so few apparent gains listed. Are most people who benefited just being silent? We should expect a certain number of headache-cures, etc., just by placebo effects or coincidences of timing.

There have since been a few more people claiming practical benefits from x-rationality, but we should generally expect more people to claim benefits than to actually experience them. Anna mentions the placebo effect, and to that I would add cognitive dissonance—people spent all this time learning x-rationality, so it MUST have helped them!—and the same sort of confirmation bias that makes Christians swear that their prayers really work.

I find my personal experience in accord with the evidence from Vladimir’s thread. I’ve gotten countless clarity-of-mind benefits from Overcoming Bias’ x-rationality, but practical benefits? Aside from some peripheral disciplines1, I can’t think of any.

Looking over history, I do not find any tendency for successful people to have made a formal study of x-rationality. This isn’t entirely fair, because the discipline has expanded vastly over the past fifty years, but the basics—syllogisms, fallacies, and the like—have been around much longer. The few groups who made a concerted effort to study x-rationality didn’t shoot off an unusual number of geniuses—the Korzybskians are a good example. In fact, as far as I know, the only follower of Korzybski to turn his ideas into a vast personal empire of fame and fortune was (ironically!) L. Ron Hubbard, who took the basic concept of techniques to purge confusions from the mind, replaced the substance with a bunch of attractive flim-flam, and founded Scientology. And like Hubbard’s superstar followers, many of this century’s most successful people have been notably irrational.

There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it. The evidence in favor of the proposition right now seems to be its sheer obviousness. Rationality is the study of knowing the truth and making good decisions. How the heck could knowing more than everyone else and making better decisions than them not make you more successful?!?

This is a difficult question, but I think it has an answer. A complex, multifactorial answer, but an answer.

One factor we have to once again come back to is akrasia2. I find akrasia in myself and others to be the most important limiting factor to our success. Think of that phrase “limiting factor” formally, the way you’d think of the limiting reagent in chemistry. When there’s a limiting reagent, it doesn’t matter how much more of the other reagents you add; the reaction’s not going to make any more product. Rational decisions are practically useless without the willpower to carry them out. If our limiting reagent is willpower and not rationality, throwing truckloads of rationality into our brains isn’t going to increase success very much.

This is a very large part of the story, but not the whole story. If I were rational enough to pick only stocks that would go up, I’d become successful regardless of how little willpower I had, as long as it was enough to pick up the phone and call my broker.

So the second factor is that most people are rational enough for their own purposes. Oh, they go on wild flights of fancy when discussing politics or religion or philosophy, but when it comes to business they suddenly become cold and calculating. This relates to Robin Hanson on Near and Far modes of thinking. Near Mode thinking is actually pretty good at a lot of things, and Near Mode thinking is the thinking whose accuracy gives us practical benefits.

And—when I was young, I used to watch The Journey of Allen Strange on Nickelodeon. It was a children’s show about this alien who came to Earth and lived with these kids. I remember one scene where Allen the Alien was watching the kids play pool. “That’s amazing,” Allen told them. “I could never calculate differential equations in my head that quickly.” The kids had to convince him that “it’s in the arm, not the head”—that even though the movement of the balls is governed by differential equations, humans don’t actually calculate the equations each time they play. They just move their arm in a way that feels right. If Allen had been smarter, he could have explained that the kids were doing some very impressive mathematics on a subconscious level that produced their arm’s perception of “feeling right”. But the kids’ point still stands: even though in theory explicit mathematics will produce better results than eyeballing it, in practice you can’t become a good pool player just by studying calculus.

A lot of human rationality follows the same pattern. Isaac Newton is frequently named as a guy who knew no formal theories of science or rationality, who was hopelessly irrational in his philosophical beliefs and his personal life, but who is still widely and justifiably considered the greatest scientist who ever lived. Would Newton have gone even further if he’d known Bayes’ theorem? Probably it would’ve been like telling the world pool champion to try using more calculus in his shots: not a pretty sight.

Yes, yes, beisutsukai should be able to develop quantum gravity in a month and so on. But until someone on Less Wrong actually goes and does it, that story sounds a lot like when Alfred Korzybski claimed that World War Two could have been prevented if everyone had just used more General Semantics.

And then there’s just plain noise. Your success in the world depends on things ranging from your hairstyle to your height to your social skills to your IQ score to cognitive constructs psychologists don’t even have names for yet. X-Rationality can help you succeed. But so can excellent fashion sense. It’s not clear in real-world terms that x-rationality has more of an effect than fashion. And don’t dismiss that with “A good x-rationalist will know if fashion is important, and study fashion.” A good normal rationalist could do that too; it’s not a specific advantage of x-rationalism, just of having a general rational outlook. And having a general rational outlook, as I mentioned before, is limited in its effectiveness by poor application and akrasia.

I no longer believe mastering all these Overcoming Bias and Less Wrong techniques will turn me into Anasûrimbor Kellhus or John Galt. I no longer even believe mastering all these Overcoming Bias techniques will turn me into Eliezer Yudkowsky (who, as his writings from 2001 indicate, had developed his characteristic level of awesomeness before he became interested in x-rationality at all)3. I think it may help me succeed in life a little, but I think the correlation between x-rationality and success is probably closer to 0.1 than to 1. Maybe 0.2 in some businesses like finance, but people in finance tend to know this and use specially developed x-rationalist techniques on the job already without making it a lifestyle commitment. I think it was primarily a Happy Death Spiral around how wonderfully super-awesome x-rationality was that made me once think otherwise.

And this is why I am not so impressed by Eliezer’s claim that an x-rationality instructor should be successful in their non-rationality life. Yes, there probably are some x-rationalists who will also be successful people. But again, correlation 0.1. Stop saying only practically successful people could be good x-rationality teachers! Stop saying we need to start having huge real-life victories or our art is useless! Stop calling x-rationality the Art of Winning! Stop saying I must be engaged in some sort of weird signalling effort for saying I’m here because I like mental clarity instead of because I want to be the next Bill Gates! It trivializes the very virtues that brought most of us to Overcoming Bias, and replaces them with what sounds a lot like a pitch for some weird self-help cult...



...but you will disagree with me. And we are both aspiring rationalists, and therefore we resolve disagreements by experiments. I propose one.

For the next time period—a week, a month, whatever—take special note of every decision you make. By “decision”, I don’t mean the decision to get up in the morning; I mean the sort that’s made on a conscious level and requires at least a few seconds’ serious thought. Make a tick mark, literal or mental, so you can count how many of these there are.

Then note whether you make that decision rationally. If yes, also record whether you made that decision x-rationally. I don’t just mean you spent a brief second thinking about whether any biases might have affected your choice. I mean a decision where you think there’s a serious (let’s arbitrarily say 33%) chance that using x-rationality instead of normal rationality actually changed the result.

Finally, note whether, once you came to the rational conclusion, you actually followed it. This is not a trivial matter. For example, before writing this blog post I wondered briefly whether I should use the time studying instead, used normal (but not x-) rationality to determine that yes, I should, and then proceeded to write this anyway. And if you get that far, note whether your x-rational decisions tend to turn out particularly well.
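If you would rather keep the tally in software than with literal tick marks, here is a minimal sketch of the bookkeeping in Python (the `DecisionLog` class, its field names, and its nesting rules are my own illustration of the protocol above, not anything prescribed in this post):

```python
from dataclasses import dataclass


@dataclass
class DecisionLog:
    """Running tally for the self-experiment: every conscious decision gets
    one tick, plus flags for how it was made and whether it was followed."""
    total: int = 0        # all conscious decisions
    rational: int = 0     # decided rationally at all
    x_rational: int = 0   # x-rationality plausibly changed the result
    followed: int = 0     # actually carried out the conclusion

    def record(self, was_rational=False, was_x_rational=False, followed=False):
        # The counts nest: a decision only counts as x-rational
        # if it was made rationally in the first place.
        self.total += 1
        if was_rational:
            self.rational += 1
            if was_x_rational:
                self.x_rational += 1
        if followed:
            self.followed += 1

    def summary(self):
        return {
            "decisions": self.total,
            "rational": self.rational,
            "x_rational": self.x_rational,
            "followed": self.followed,
        }
```

At the end of the period, `summary()` gives the four counts the experiment asks for, e.g. `DecisionLog()` after a few `record(...)` calls will show how rarely the inner counters move relative to `decisions`.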

This experiment seems easy to rig4; merely doing it should increase your level of conscious rational decisions quite a bit. And yet I have been trying it for the past few days, and the results have not been pretty. Not pretty at all. Not only do I make fewer conscious decisions than I thought, but I rarely apply even the slightest modicum of rationality to the ones I do make; among the ones I apply rationality to, it’s practically never x-rationality; and when I do apply everything I’ve got, I don’t seem to follow those decisions too consistently.

I’m not so great a rationalist anyway, and I may be especially bad at this. So I’m interested in hearing how different your results are. Just don’t rig it. If you find yourself using x-rationality twenty times more often than you were when you weren’t performing the experiment, you’re rigging it, consciously or otherwise5.

Eliezer writes:

The novice goes astray and says, “The Art failed me.”
The master goes astray and says, “I failed my Art.”

Yet one way to fail your Art is to expect more of it than it can deliver. No matter how good a swimmer you are, you will not be able to cross the Pacific. This is not to say crossing the Pacific is impossible. It just means it will require a different sort of thinking than the one you’ve been using thus far. Perhaps there are developments of the Art of Rationality or its associated Arts that can turn us into a Kellhus or a Galt, but they will not be reached by trying to overcome biases really really hard.


1: Specifically, reading Overcoming Bias convinced me to study evolutionary psychology in some depth, which has been useful in social situations. As far as I know. I’d probably be biased into thinking it had been even if it hadn’t, because I like evo psych and it’s very hard to measure.

2: Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to “kicking” to our “punching”. I’m not sure why he considers them to be the same Art rather than two related Arts.

3: This is actually an important point. I think there are probably quite a few smart, successful people who develop an interest in x-rationality, but I can’t think of any people who started out merely above-average, developed an interest in x-rationality, and then became smart and successful because of that x-rationality.

4: This is a terribly controlled experiment, and the only way its data can be meaningfully interpreted at all is through what one of my professors called the “ocular trauma test”—when the data hits you between the eyes. If people claim they always follow their rational decisions, I think I will be more likely to interpret it as a lack of enough cognitive self-consciousness to notice when they’re doing something irrational than as an honest lack of irrationality.

5: In which case it will have ceased to be an experiment and become a technique instead. I’ve noticed this happening a lot over the past few days, and I may continue doing it.