Is Rationalist Self-Improvement Real?

Cross-posted from Putanumonit where the images show up way bigger. I don’t know how to make them bigger on LW.


Basketballism

Imagine that tomorrow everyone on the planet forgets the concept of training basketball skills.

The next day everyone is as good at basketball as they were the previous day, but this talent is assumed to be fixed. No one expects their performance to change over time. No one teaches basketball, although many people continue to play the game for fun.

Snapshots of a Steph Curry jump shot. Image credit: ESPN.

Geneticists explain that some people are born with better hand-eye coordination and are thus able to shoot a basketball accurately. Economists explain that highly-paid NBA players have a stronger incentive to hit shots, which explains their improved performance. Psychologists note that people who take more jump shots each day hit a higher percentage and theorize a principal factor of basketball affinity that influences both desire and skill at basketball. Critical theorists claim that white men’s under-representation in the NBA is due to systemic oppression.

Papers are published, tenure is awarded.

New scientific disciplines emerge and begin studying jump shots more systematically. Evolutionary physiologists point out that our ancestors threw stones more often than they tossed basketballs, which explains our lack of adaptation to that particular motion. Behavioral kinesiologists describe systematic biases in human basketball, such as the tendency to shoot balls with a flatter trajectory and a lower release point than is optimal.

When asked by aspiring basketball players if jump shots can be improved, they all shake their heads sadly and say that it is human nature to miss shots. A Nobel laureate behavioral kinesiologist tells audiences that even after he wrote a book on biases in basketball his shot did not improve much. Someone publishes a study showing that basketball performance improves after a one-hour training session with schoolchildren, but Shott Ballexander writes a critical takedown pointing out that the effect wore off after a month and could simply be random noise. The field switches to studying “nudges”: ways to design systems so that players hit more shots at the same level of skill. They recommend that the NBA adopt larger hoops.

Papers are published, tenure is awarded.

Then, one day, a woman who is not an academic, just someone looking to get good at basketball, sits down to read those papers. She realizes that the lessons of behavioral kinesiology can be used to improve her jump shot, and practices releasing the ball at the top of her jump from above the forehead. Her economist friend reminds her to give the ball more arc. As the balls start swooshing in, more people gather at the gym to practice shooting. They call themselves Basketballists.

Most people who walk past the gym sneer at the Basketballists. “You call yourselves Basketballists and yet none of you shoots 100%,” they taunt. “You should go to grad school if you want to learn about jump shots.” Some of the Basketballists themselves begin to doubt the project, especially since switching to the new shooting techniques lowers their performance at first. “Did you hear what the Center for Applied Basketball is charging for a training camp?” they mutter. “I bet it doesn’t even work.”

Within a few years, some dedicated Basketballists start talking about how much their shot percentage improved and the pick-up tournaments they won. Most people say it’s just selection bias, or dismiss them by asking why they can’t outplay Kawhi Leonard for all their training.

The Basketballists insist that the training does help, that they really get better by the day. But how could they know?

AsWrongAsEver

A core axiom of Rationality is that it is a skill that can be improved with time and practice. The very names Overcoming Bias and Less Wrong reflect this: rationality is a vector, not a fixed point.

A core foundation of Rationality is the research on heuristics and biases led by Daniel Kahneman. The very first book in The Sequences is in large part a summary of Kahneman’s work.

Awkwardly for Rationalists, Daniel Kahneman is hugely skeptical of any possible improvement in rationality, especially for whole groups of people. In an astonishing interview with Sam Harris, Kahneman describes bias after bias in human thinking, emotions, and decision making. For every one, Sam asks: how do we get better at this? And for every one, Daniel replies: we don’t, we’ve been telling people about this for decades and nothing has changed, that’s just how people are.

Daniel Kahneman is familiar with CFAR but as far as I know, he has not put as much effort himself into developing a community and curriculum dedicated to improving human rationality. He has discovered and described human irrationality, mostly to an audience of psychology undergrads. And psychology undergrads do worse than pigeons at learning a simple probabilistic game, so we shouldn’t expect them to learn rationality just by reading about biases. Perhaps if they started reading Slate Star Codex…

Alas, Scott Alexander himself is quite skeptical of Rationalist self-improvement. He certainly believes that Rationalist thinking can help you make good predictions and occasionally distinguish truth from bullshit, but he’s unconvinced that the underlying ability can be improved upon. Scott is even more skeptical of Rationality’s use for life-optimization.

I told Scott that I credit Rationality with a lot of the massive improvements in my financial, social, romantic, and mental life that happened to coincide with my discovery of LessWrong. Scott argued that I would do equally well in the absence of Rationality by finding other self-improvement philosophies to pour my intelligence and motivation into, and that these two are the root cause of my life getting better. Scott also seems to have been doing very well since he discovered LessWrong, but he credits Rationality with not much more than being a flag that united the community he’s part of.

So: on one side are Yudkowsky, CFAR, and several Rationalists, sharing the belief that Rationality is a learnable skill that can improve the lives of most seekers who step on the path. On the other side are Kahneman, Scott, several other Rationalists, and all anti-Rationalists, who disagree.

When I surveyed my Twitter followers, the results distributed somewhat predictably:



The optimistic take is that RSI would work for most people if only they tried it. The neutral take is that people are good at trying the self-improvement philosophies that would work for them. The pessimistic take is that Rationalists are deluded by sunk cost and confirmation bias.

Who’s right? Is Rationality trainable like jump shots or fixed like height? Before reaching any conclusions, let’s try to figure out why so many smart people who are equally familiar with Rationality disagree so strongly about this important question.

Great Expectations

An important crux of disagreement between me and Scott is in the question of what counts as successful Rationalist self-improvement. We can both look at the same facts and come to very different conclusions regarding the utility of Rationality.

Here’s how Scott parses the fact that 15% of SSC readers who were referred by LessWrong have made over $1,000 by investing in cryptocurrency and 3% made over $100,000:

The first mention of Bitcoin on Less Wrong, a post called Making Money With Bitcoin, was in early 2011 – when it was worth 91 cents. Gwern predicted that it could someday be worth “upwards of $10,000 a bitcoin”. […]
This was the easiest test case of our “make good choices” ability that we could possibly have gotten, the one where a multiply-your-money-by-a-thousand-times opportunity basically fell out of the sky and hit our community on its collective head. So how did we do?
I would say we did mediocre. […]
Overall, if this was a test for us, I give the community a C and me personally an F. God arranged for the perfect opportunity to fall into our lap. We vaguely converged onto the right answer in an epistemic sense. And 3 – 15% of us, not including me, actually took advantage of it and got somewhat rich.

Here’s how I would describe it:

Of the 1289 people who were referred to SSC from LessWrong, two thirds are younger than 30, a third are students/interns or otherwise yet to start their careers, and many are for other reasons too broke for it to be actually rational to risk even $100 on something that they saw recommended on a blog. Of the remainder, the majority were not around in the early days when cryptocurrencies were discussed — the median “time in community” on LessWrong surveys is around two years. In any case, “invest in crypto” was never a major theme or universally endorsed in the Rationalist community.
Of those that were around and had the money to invest early enough, a lot lost it all when Mt. Gox was hacked, when Bitcoin crashed in late 2013 and didn’t recover until 2017, or through several other contingencies.
If I had to guess the percent of Rationalists who were even in a position to learn about crypto on LessWrong and make more than $1,000 by following Rationalist advice, I’d say it’s certainly less than 50%. Maybe not much larger than 15%.
Only 8% of Americans own cryptocurrency today. At the absolute highest end estimate, 1% of Americans, and 0.1% of people worldwide, made >$1,000 from crypto. So Rationalists did at least an order of magnitude better than the general population, almost as well as they could’ve done in a perfect world, and also funded MIRI and CFAR with Bitcoin for years ahead. I give the community an A and myself an A.

In an essay called Extreme Rationality: It’s Not That Great, Scott writes:

Eliezer writes:
The novice goes astray and says, “The Art failed me.”
The master goes astray and says, “I failed my Art.”
Yet one way to fail your Art is to expect more of it than it can deliver.

Scott means to say that Eliezer expects too much of the art in demanding that great Rationalist teachers be great at other things as well. But I think that expecting 50% of LessWrongers filling out a survey to have made thousands of dollars from crypto is setting the bar far higher than Eliezer’s criterion of “Being a math professor at a small university who has published a few original proofs, or a successful day trader who retired after five years to become an organic farmer, or a serial entrepreneur who lived through three failed startups before going back to a more ordinary job as a senior programmer.”

Akrasia

Scott blames the failure of Rationality to help primarily on akrasia.

One factor we have to once again come back to is akrasia. I find akrasia in myself and others to be the most important limiting factor to our success. Think of that phrase “limiting factor” formally, the way you’d think of the limiting reagent in chemistry. When there’s a limiting reagent, it doesn’t matter how much more of the other reagents you add, the reaction’s not going to make any more product. Rational decisions are practically useless without the willpower to carry them out. If our limiting reagent is willpower and not rationality, throwing truckloads of rationality into our brains isn’t going to increase success very much.

I take this paragraph to imply a model that looks like this:

[Alex reads LessWrong] → [Alex tries to become less wrong] → [akrasia] → [Alex doesn’t improve].

I would make a small change to this model:

[Alex reads LessWrong] → [akrasia] → [Alex doesn’t try to become less wrong] → [Alex doesn’t improve].

A lot of LessWrong is very fun to read, as is all of SlateStarCodex. A large number of people on these sites, as on Putanumonit, are just looking to procrastinate during the workday, not to change how their mind works. Only 7% of the people who were engaged enough to fill out the last LessWrong survey have attended a CFAR workshop. Only 20% ever wrote a post, which is some measure of active rather than passive engagement with the material.

In contrast, one person wrote a sequence on trying out applied rationality for 30 days straight: Xiaoyu “The Hammer” He. And he was quite satisfied with the result.

I’m not sure that Scott and I disagree much, but I didn’t get the sense that his essay was saying “just reading about this stuff doesn’t help, you have to actually try”. It also doesn’t explain why he was so skeptical about me crediting my own improvement to Rationality.

Akrasia is discussed a lot on LessWrong, and applied rationality has several tools that help with it. What works for me and my smart friends is not to try to generate willpower but to use lucid moments to design plans that take a lack of willpower into account. Other approaches work for other people. But of course, if someone lacks the willpower to even try to take Rationality improvement seriously, a mere blog post will not help them.

3% LessWrong

Scott also highlights the key sentence in his essay:

I think it may help me succeed in life a little, but I think the correlation between x-rationality and success is probably closer to 0.1 than to 1.

What he doesn’t ask himself is: how big is a correlation of 0.1?

Here’s the chart of respondents to the SlateStarCodex survey, by self-reported yearly income and whether they were referred from LessWrong (Scott’s criterion for Rationalists).

And here’s the same chart with a small change. Can you notice it?

For the second chart, I increased the income of all rationalists by 25%.

The following things are both true:

  • When you eyeball the group as a whole, the charts look identical. A 25% improvement for a quarter of the people in a group you observe is barely noticeable. The rich stayed rich, the poor stayed poor.

  • If your own income increased 25% you would certainly notice it. And if the increase came as a result of reading a few blog posts and coming to a few meetups, you would tell everyone you know about this astounding life hack.

The correlation between Rationality and income in Scott’s survey is −0.01. That number goes up to a mere 0.02 after the increase. A correlation of 0.1 is absolutely huge; it would require tripling the income of all Rationalists.
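This is easy to sanity-check with a quick simulation. The numbers below (sample size, lognormal income parameters) are illustrative assumptions of mine, not the survey’s actual data; the point is only that a 25% boost to a quarter of a heavy-tailed sample moves the group-level correlation by a few hundredths.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Roughly a quarter of the sample are "rationalists"
rationalist = rng.random(n) < 0.25

# Heavy-tailed stand-in for self-reported income (parameters are made up)
income = rng.lognormal(mean=11.0, sigma=1.5, size=n)

def corr(flag, inc):
    """Point-biserial correlation between group membership and income."""
    return np.corrcoef(flag.astype(float), inc)[0, 1]

before = corr(rationalist, income)
# Give the rationalists a 25% income boost and recompute
after = corr(rationalist, income * np.where(rationalist, 1.25, 1.0))

print(f"correlation before boost: {before:+.3f}")
print(f"correlation after  boost: {after:+.3f}")
# The shift is a few hundredths: life-changing for the individuals,
# nearly invisible in the aggregate statistic.
```

The heavy tail does the work here: a handful of very high incomes dominate the standard deviation, so even a 25% multiplicative boost for a quarter of the sample barely registers in the correlation.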

The point isn’t to nitpick Scott’s choice of “correlation = 0.1” as a metaphor. But every measure of success we care about, like impact on the world or popularity or enlightenment, is probably distributed like income is on the survey. And so if Rationality made you 25% more successful it wouldn’t be as obviously visible as Scott thinks it would be — especially since everyone pursues a different vision of success. In this 25% world, the most and least successful people would still be such for reasons other than Rationality. And in this world, Rationality would be one of the most effective self-improvement approaches ever devised. 25% is a lot!

Of course, the 25% increase wouldn’t happen immediately. Most people who take Rationality seriously have been in the community for several years. You get to a 25% improvement by getting 3% better each year for 8 years.

Here’s what 3% improvement feels like:

You know what feels crappy? 3% improvement. You busted your ass for a year, trying to get better at dating, at being less of an introvert, at self-soothing your anxiety – and you only managed to get 3% better at it.
If you worked a job where you put in that much time at the office and they gave you a measly 3% raise, you would spit in your boss’s face and walk the fuck out.
And, in fact, that’s what most people do: quit. […]
The model for most self-improvement is usually this:
* You don’t have much of a problem
* You found The Breakthrough that erased all the issues you had
* When you’re done, you’ll be the opposite of what you were. Used to be bad at dating? Now you’ll have your own personal harem. Used to be useless at small talk? Now you’re a fluent raconteur.
Which, when you’ve agonized to scrape together a measly 3% improvement, feels like crap. If you’re burdened with such social anxiety that it takes literally everything you have to go out in public for twenty minutes, make one awkward small talk, and then retreat home to collapse in embarrassment, you think, “Well, this isn’t worth it.”
But most self-improvement isn’t immediate improvement, my friend.
It’s compound interest.

I think that Rationalist self-improvement is like this. You don’t get better at life and rationality after taking one class with Prof. Kahneman. After 8 years of hard work, you don’t stand out from the crowd even as the results become personally noticeable. And if you discover Rationality in college and stick with it, by the time you’re 55 you will be three times better than what you would have been if you hadn’t compounded these 3% gains year after year, and everyone will notice that.
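The compounding arithmetic checks out, assuming (my assumption, for illustration) a college discoverer who starts at age 18:

```python
# 3% annual improvement, compounded year over year
rate = 1.03

print(rate ** 8)          # ~1.27: about 25% better after 8 years
print(rate ** (55 - 18))  # ~2.99: roughly 3x better by age 55
```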

What’s more, the outcomes don’t scale smoothly with your level of skill. When rare, high-leverage opportunities come around, being slightly more rational can make a huge difference. Bitcoin was one such opportunity; meeting my wife was another one for me. I don’t know what the next one will be: an emerging technology startup? A political upheaval? Cryonics? I know that the world is getting weirder faster, and the payouts to Rationality are going to increase commensurately.

Here’s what Scott himself wrote in response to a critic of Bayesianism:

Probability theory in general, and Bayesianism in particular, provide a coherent philosophical foundation for not being an idiot.
Now in general, people don’t need coherent philosophical foundations for anything they do. They don’t need grammar to speak a language, they don’t need classical physics to hit a baseball, and they don’t need probability theory to make good decisions. This is why I find all the “But probability theory isn’t that useful in everyday life!” complaining so vacuous.
“Everyday life” means “inside your comfort zone”. You don’t need theory inside your comfort zone, because you already navigate it effortlessly. But sometimes you find that the inside of your comfort zone isn’t so comfortable after all (my go-to grammatical example is answering the phone “Scott? Yes, this is him.”) Other times you want to leave your comfort zone, by for example speaking a foreign language or creating a conlang.
When David says that “You can’t possibly be an atheist because…” doesn’t count because it’s an edge case, I respond that it’s exactly the sort of thing that should count, because it’s people trying to actually think about an issue outside their comfort zone which they can’t handle on intuition alone. It turns out when most people try this they fail miserably. If you are the sort of person who likes to deal with complicated philosophical problems outside the comfortable area where you can rely on instinct – and politics, religion, philosophy, and charity all fall in that area – then it’s really nice to have an epistemology that doesn’t suck.

If you’re the sort of person for whom success in life means stepping outside the comfort zone that your parents and high school counselor charted out for you, if you’re willing to explore spaces of consciousness and relationships that other people warn you about, if you compare yourself only to who you were yesterday and not to who someone else is today… If you’re weird like me, and if you’re reading this you probably are, I think that Rationality can improve your life a lot.

But to get better at basketball, you have to actually show up to the gym.


See also: The Martial Art of Rationality.