Incremental Progress and the Valley

Yesterday I said: “Rationality is systematized winning.”

“But,” you protest, “the reasonable person doesn’t always win!”

What do you mean by this? Do you mean that every week or two, someone who bought a lottery ticket with negative expected value wins the lottery and becomes much richer than you? That is not a systematic loss; it is selective reporting by the media. From a statistical standpoint, lottery winners don’t exist—you would never encounter one in your lifetime, if it weren’t for the selective reporting.
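
To make “negative expected value” concrete, here is a minimal sketch in Python with made-up round numbers (a $2 ticket, a $100 million jackpot, roughly 1-in-300-million odds; none of these figures come from the post):

```python
# Expected value of one lottery ticket, using illustrative numbers only.
ticket_price = 2.00            # dollars paid for the ticket
jackpot = 100_000_000          # dollars won if the ticket hits
p_win = 1 / 300_000_000        # assumed probability of hitting the jackpot

expected_value = p_win * jackpot - ticket_price
print(expected_value)          # about -1.67: the average ticket loses money
```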

Even perfectly rational agents can lose. They just can’t know in advance that they’ll lose. They can’t expect to underperform any other performable strategy, or they would simply perform it.

“No,” you say, “I’m talking about how startup founders strike it rich by believing in themselves and their ideas more strongly than any reasonable person would. I’m talking about how religious people are happier—”

Ah. Well, here’s the thing: An incremental step in the direction of rationality, if the result is still irrational in other ways, does not have to yield incrementally more winning.

The optimality theorems that we have for probability theory and decision theory are for perfect probability theory and decision theory. There is no companion theorem which says that, starting from some flawed initial form, every incremental modification of the algorithm that takes the structure closer to the ideal must yield an incremental improvement in performance. This has not yet been proven, because it is not, in fact, true.
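
As one illustration of why no such companion theorem can hold, here is a minimal sketch in Python (a toy example of my own, not from the post) of an agent whose two miscalibrations happen to offset each other, so that correcting only one of them moves its final estimate further from the truth:

```python
# Two distortions that roughly cancel: removing just one of them (an
# "incremental step toward the ideal") leaves the other unopposed.
TRUE_PAYOFF = 100.0

def estimate(payoff, overconfidence=1.5, undervaluation=0.7):
    """The agent's estimate of a payoff, warped by two biases."""
    return payoff * overconfidence * undervaluation

fully_biased = estimate(TRUE_PAYOFF)                        # 100 * 1.5 * 0.7 = 105
half_corrected = estimate(TRUE_PAYOFF, overconfidence=1.0)  # 100 * 1.0 * 0.7 = 70

print(abs(fully_biased - TRUE_PAYOFF))    # error of 5.0: closer to the truth
print(abs(half_corrected - TRUE_PAYOFF))  # error of 30.0: further from the truth
```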

“So,” you say, “what point is there then in striving to be more rational? We won’t reach the perfect ideal. So we have no guarantee that our steps forward are helping.”

You have no guarantee that a step backward will help you win, either. Guarantees don’t exist in the world of flesh; but contrary to popular misconceptions, judgment under uncertainty is what rationality is all about.

“But we have several cases where, based on either vaguely plausible-sounding reasoning or survey data, it looks like an incremental step forward in rationality is going to make us worse off. If it’s really all about winning—if you have something to protect more important than any ritual of cognition—then why take that step?”

Ah, and now we come to the meat of it.

I can’t necessarily answer for everyone, but...

My first reason is that, on a professional basis, I deal with deeply confused problems that make huge demands on precision of thought. One small mistake can lead you astray for years, and there are worse penalties waiting in the wings. An unimproved level of performance isn’t enough; my choice is to try to do better, or give up and go home.

“But that’s just you. Not all of us lead that kind of life. What if you’re just trying some ordinary human task like an Internet startup?”

My second reason is that I am trying to push some aspects of my art further than I have seen done. I don’t know where these improvements lead. The loss of failing to take a step forward is not that one step, it is all the other steps forward you could have taken, beyond that point. Robin Hanson has a saying: The problem with slipping on the stairs is not falling the height of the first step, it is that falling one step leads to falling another step. In the same way, refusing to climb one step up forfeits not the height of that step but the height of the staircase.

“But again—that’s just you. Not all of us are trying to push the art into uncharted territory.”

My third reason is that once I realize I have been deceived, I can’t just shut my eyes and pretend I haven’t seen it. I have already taken that step forward; what use to deny it to myself? I couldn’t believe in God if I tried, any more than I could believe the sky above me was green while looking straight at it. If you know everything you need to know in order to know that you are better off deceiving yourself, it’s much too late to deceive yourself.

“But that realization is unusual; other people have an easier time of doublethink because they don’t realize it’s impossible. You go around trying to actively sponsor the collapse of doublethink. You, from a higher vantage point, may know enough to expect that this will make them unhappier. So is this out of a sadistic desire to hurt your readers, or what?”

Then I finally reply that my experience so far—even in this realm of merely human possibility—does seem to indicate that, once you sort yourself out a bit and you aren’t doing quite so many other things wrong, striving for more rationality actually will make you better off. The long road leads out of the valley and higher than before, even in the human lands.

The more I know about some particular facet of the Art, the more I can see this is so. As I’ve previously remarked, my essays may be unreflective of what a true martial art of rationality would be like, because I have only focused on answering confusing questions—not fighting akrasia, coordinating groups, or being happy. In the field of answering confusing questions—the area where I have most intensely practiced the Art—it now seems massively obvious that anyone who thought they were better off “staying optimistic about solving the problem” would get stomped into the ground. By a casual student.

When it comes to keeping motivated, or being happy, I can’t guarantee that someone who loses their illusions will be better off—because my knowledge of these facets of rationality is still crude. If these parts of the Art have been developed systematically, I do not know of it. But even here I have gone to some considerable pains to dispel half-rational, half-mistaken ideas that could get in a beginner’s way, like the idea that rationality opposes feeling, or the idea that rationality opposes value, or the idea that sophisticated thinkers should be angsty and cynical.

And if, as I hope, someone goes on to develop the art of fighting akrasia or achieving mental well-being as thoroughly as I have developed the art of answering impossible questions, I do fully expect that those who wrap themselves in their illusions will not begin to compete. Meanwhile—others may do better than I, if happiness is their dearest desire, for I myself have invested little effort here.

I find it hard to believe that the optimally motivated individual, the strongest entrepreneur a human being can become, is still wrapped up in a blanket of comforting overconfidence. I think they’ve probably thrown that blanket out the window and organized their mind a little differently. I find it hard to believe that the happiest we can possibly live, even in the realms of human possibility, involves a tiny awareness lurking in the corner of your mind that it’s all a lie. I’d rather stake my hopes on neurofeedback or Zen meditation, though I’ve tried neither.

But it cannot be denied that this is a very real issue in very real life. Consider this pair of comments from Less Wrong:

I’ll be honest—my life has taken a sharp downturn since I deconverted. My theist girlfriend, with whom I was very much in love, couldn’t deal with this change in me, and after six months of painful vacillation, she left me for a co-worker. That was another six months ago, and I have been heartbroken, miserable, unfocused, and extremely ineffective since.

Perhaps this is an example of the valley of bad rationality of which PhilGoetz spoke, but I still hold my current situation higher in my preference ranking than happiness with false beliefs.

And:

My empathies: that happened to me about 6 years ago (though thankfully without as much visible vacillation).

My sister, who had some Cognitive Behaviour Therapy training, reminded me that relationships are forming and breaking all the time, and given I wasn’t unattractive and hadn’t retreated into monastic seclusion, it wasn’t rational to think I’d be alone for the rest of my life (she turned out to be right). That was helpful at the times when my feelings hadn’t completely got the better of me.

So—in practice, in real life, in sober fact—those first steps can, in fact, be painful. And then things can, in fact, get better. And there is, in fact, no guarantee that you’ll end up higher than before. Even if in principle the path must go further, there is no guarantee that any given person will get that far.

If you don’t prefer truth to happiness with false beliefs...

Well… and if you are not doing anything especially precarious or confusing… and if you are not buying lottery tickets… and if you’re already signed up for cryonics, a sudden ultra-high-stakes confusing acid test of rationality that illustrates the Black Swan quality of trying to bet on ignorance in ignorance...

Then it’s not guaranteed that taking all the incremental steps toward rationality that you can find will leave you better off. But the vaguely plausible-sounding arguments against losing your illusions generally do consider just one single step, without postulating any further steps, without suggesting any attempt to regain everything that was lost and go it one better. Even the surveys are comparing the average religious person to the average atheist, not the most advanced theologians to the most advanced rationalists.

But if you don’t care about the truth—and you have nothing to protect—and you’re not attracted to the thought of pushing your art as far as it can go—and your current life seems to be going fine—and you have a sense that your mental well-being depends on illusions you’d rather not think about—

Then you’re probably not reading this. But if you are, then, I guess… well… (a) sign up for cryonics, and then (b) stop reading Less Wrong before your illusions collapse! RUN AWAY!