Resist the Happy Death Spiral

Once upon a time, there was a man who was convinced that he possessed a Great Idea. Indeed, as the man thought upon the Great Idea more and more, he realized that it was not just a great idea, but the most wonderful idea ever. The Great Idea would unravel the mysteries of the universe, supersede the authority of the corrupt and error-ridden Establishment, confer nigh-magical powers upon its wielders, feed the hungry, heal the sick, make the whole world a better place, etc., etc., etc.

The man was Francis Bacon, his Great Idea was the scientific method, and he was the only crackpot in all history to claim that level of benefit to humanity and turn out to be completely right.1

That’s the problem with deciding that you’ll never admire anything that much: Some ideas really are that good. Though no one has fulfilled claims more audacious than Bacon’s; at least, not yet.

But then how can we resist the happy death spiral with respect to Science itself? The happy death spiral starts when you believe something is so wonderful that the halo effect leads you to find more and more nice things to say about it, making you see it as even more wonderful, and so on, spiraling up into the abyss. What if Science is in fact so beneficial that we cannot acknowledge its true glory and retain our sanity? Sounds like a nice thing to say, doesn’t it? Oh no it’s starting ruuunnnnn . . .

If you retrieve the standard cached deep wisdom for don’t go overboard on admiring science, you will find thoughts like “Science gave us air conditioning, but it also made the hydrogen bomb” or “Science can tell us about stars and biology, but it can never prove or disprove the dragon in my garage.” But the people who originated such thoughts were not trying to resist a happy death spiral. They weren’t worrying about their own admiration of science spinning out of control. Probably they didn’t like something science had to say about their pet beliefs, and sought ways to undermine its authority.

The standard negative things to say about science aren’t likely to appeal to someone who genuinely feels the exultation of science—that’s not the intended audience. So we’ll have to search for other negative things to say instead.

But if you look selectively for something negative to say about science—even in an attempt to resist a happy death spiral—do you not automatically convict yourself of rationalization? Why would you pay attention to your own thoughts, if you knew you were trying to manipulate yourself?

I am generally skeptical of people who claim that one bias can be used to counteract another. It sounds to me like an automobile mechanic who says that the motor is broken on your right windshield wiper, but instead of fixing it, they’ll just break your left windshield wiper to balance things out. This is the sort of cleverness that leads to shooting yourself in the foot. Whatever the solution, it ought to involve believing true things, rather than believing you believe things that you believe are false.

Can you prevent the happy death spiral by restricting your admiration of Science to a narrow domain? Part of the happy death spiral is seeing the Great Idea everywhere—thinking about how Communism could cure cancer if it were only given a chance. Probably the single most reliable sign of a cult guru is that the guru claims expertise, not in one area, not even in a cluster of related areas, but in everything. The guru knows what cult members should eat, wear, do for a living; who they should have sex with; which art they should look at; which music they should listen to . . .

Unfortunately for this plan, most people fail miserably when they try to describe the neat little box that science has to stay inside. The usual trick, “Hey, science won’t cure cancer,” isn’t going to fly. “Science has nothing to say about a parent’s love for their child”—sorry, that’s simply false. If you try to sever science from e.g. parental love, you aren’t just denying cognitive science and evolutionary psychology. You’re also denying Martine Rothblatt’s founding of United Therapeutics to seek a cure for her daughter’s pulmonary hypertension.2 Science is legitimately related, one way or another, to just about every important facet of human existence.

All right, so what’s an example of a false nice claim you could make about science?

One false claim, in my humble opinion, is that science is so wonderful that scientists shouldn’t even try to take ethical responsibility for their work—it will turn out well in the end regardless. It appears to me that this misunderstands the process whereby science benefits humanity. Scientists are human; they have prosocial concerns just like most other people, and this is at least part of why science ends up doing more good than evil.

But that point is, evidently, not beyond dispute. So here’s a simpler false nice claim: “A cancer patient can be cured just through the publishing of enough journal papers.” Or: “Sociopaths could become fully normal, if they just committed themselves to never believing anything without replicated experimental evidence with p < 0.05.”

The way to avoid believing such statements isn’t an affective cap, deciding that science is only slightly nice. Nor searching for reasons to believe that publishing journal articles causes cancer. Nor believing that science has nothing to say about cancer one way or the other.

Rather, if you know with enough specificity how science works, then you know that while it may be possible for “science to cure cancer,” a cancer patient writing journal papers isn’t going to experience a miraculous remission. That specific proposed chain of cause and effect is not going to work out.

The happy death spiral is only an emotional problem because of a perceptual problem, the halo effect, that makes us more likely to accept future positive claims once we’ve accepted an initial positive claim. We can’t get rid of this effect just by wishing; it will probably always influence us a little. But we can manage to slow down, stop, consider each additional nice claim as an additional burdensome detail, and focus on the specific points of the claim apart from its positiveness.

What if a specific nice claim “can’t be disproven” but there are arguments “both for and against” it? Actually these are words to be wary of in general, because often this is what people say when they’re rehearsing the evidence or avoiding the real weak points. Given the danger of the happy death spiral, it makes sense to try to avoid being happy about unsettled claims—to avoid making them into a source of yet more positive affect about something you liked already.

The happy death spiral is only a big emotional problem because of the overly positive feedback, the ability for the process to go critical. You may not be able to eliminate the halo effect entirely, but you can apply enough critical reasoning to keep the halos subcritical—make sure that the resonance dies out rather than exploding.
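The subcritical/supercritical distinction can be made concrete with a toy model (my own illustrative sketch, not anything claimed in the essay): suppose each accepted nice claim boosts your receptivity to the next by some halo gain. With gain below 1 the accumulated affect settles at a finite level; above 1 it explodes.

```python
# Toy model (an illustrative assumption, not a cognitive theory): each accepted
# positive claim contributes an affect boost equal to the previous boost times
# a "halo gain". The running total is then just a geometric series.
def total_affect(gain: float, steps: int = 50) -> float:
    """Sum the affect boosts from `steps` successive claims at the given gain."""
    boost = 1.0
    total = 0.0
    for _ in range(steps):
        total += boost
        boost *= gain
    return total

print(total_affect(0.5))  # subcritical: the resonance dies out, total stays bounded
print(total_affect(1.1))  # supercritical: each claim amplifies the next, total explodes
```

The point of the sketch is only that the qualitative behavior flips at gain 1, which is the sense in which "enough critical reasoning" keeps the chain reaction from going critical.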

You might even say that the whole problem starts with people not bothering to critically examine every additional burdensome detail—demanding sufficient evidence to compensate for complexity, searching for flaws as well as support, invoking curiosity—once they’ve accepted some core premise. Without the conjunction fallacy, there might still be a halo effect, but there wouldn’t be a happy death spiral.3
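The arithmetic behind "each additional detail is burdensome" is worth spelling out: a conjunction of claims can never be more probable than any single claim in it, so every added nice claim multiplies the package's probability down and must pay for itself in evidence. A quick sketch, where the 0.8 per-claim probability is an arbitrary number of my choosing:

```python
# Assign each nice claim a generous probability of 0.8 (an arbitrary
# illustrative figure) and, for simplicity, treat the claims as independent.
# The joint probability of the whole package shrinks with every added claim.
p_each = 0.8

for n_claims in range(1, 6):
    joint = p_each ** n_claims
    print(f"{n_claims} nice claim(s): joint probability {joint:.3f}")
```

Even with generous per-claim odds, five stacked claims leave the full package at roughly a one-in-three chance, which is why each one needs its own supporting evidence rather than a borrowed glow from the others.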

Even on the nicest Nice Thingies in the known universe, a perfect rationalist who demanded exactly the necessary evidence for every additional (positive) claim would experience no affective resonance. You can’t do this, but you can stay close enough to rational to keep your happiness from spiraling out of control.4

Stuart Armstrong gives closely related advice:5

Cut up your Great Thingy into smaller independent ideas, and treat them as independent.

For instance a marxist would cut up Marx’s Great Thingy into a theory of value of labour, a theory of the political relations between classes, a theory of wages, a theory on the ultimate political state of mankind. Then each of them should be assessed independently, and the truth or falsity of one should not halo on the others. If we can do that, we should be safe from the spiral, as each theory is too narrow to start a spiral on its own.

This, metaphorically, is like keeping subcritical masses of plutonium from coming together. Three Great Ideas are far less likely to drive you mad than one Great Idea. Armstrong’s advice also helps promote specificity: As soon as someone says, “Publishing enough papers can cure your cancer,” you ask, “Is that a benefit of the experimental method, and if so, at which stage of the experimental process is the cancer cured? Or is it a benefit of science as a social process, and if so, does it rely on individual scientists wanting to cure cancer, or can they be self-interested?” Hopefully this leads you away from the good or bad feeling, and toward noticing the confusion and lack of support.

To summarize, you do avoid a Happy Death Spiral by:

  • Splitting the Great Idea into parts;

  • Treating every additional detail as burdensome;

  • Thinking about the specifics of the causal chain instead of the good or bad feelings;

  • Not rehearsing evidence; and

  • Not adding happiness from claims that “you can’t prove are wrong”;

but not by:

  • Refusing to admire anything too much;

  • Conducting a biased search for negative points until you feel unhappy again; or

  • Forcibly shoving an idea into a safe box.

1Bacon didn’t singlehandedly invent science, of course, but he did contribute, and may have been the first to realize the power.

2Successfully, I might add.

3For more background, see “Burdensome Details,” “How Much Evidence Does it Take?”, and “Occam’s Razor” in the previous volume, Map and Territory.

4The really dangerous cases are the ones where any criticism of any positive claim about the Great Thingy feels bad or is socially unacceptable. Arguments are soldiers; any positive claim is a soldier on our side; stabbing your soldiers in the back is treason. Then the chain reaction goes supercritical. More on this later.

5Source: http://lesswrong.com/lw/lm/affective_death_spirals/gp5.