Circular Altruism

Followup to: Torture vs. Dust Specks, Zut Allais, Rationality Quotes 4

Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:

  1. Save 400 lives, with certainty.

  2. Save 500 lives, with 90% probability; save no lives, 10% probability.

Most people choose option 1. Which, I think, is foolish; because if you multiply 500 lives by 90% probability, you get an expected value of 450 lives, which exceeds the 400-life value of option 1. (Lives saved don’t diminish in marginal utility, so this is an appropriate calculation.)
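
If it helps to see the arithmetic spelled out, here is a minimal sketch of the expected-value calculation, using only the numbers from the two options above:

```python
# Expected lives saved under each option.
option_1 = 1.0 * 400             # save 400 lives with certainty
option_2 = 0.9 * 500 + 0.1 * 0   # 90% chance of saving 500, 10% chance of saving none

print(option_1)  # 400.0
print(option_2)  # 450.0 -> the gamble saves 50 more lives in expectation
```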

“What!” you cry, incensed. “How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You’re following your rationality off a cliff!”

Ah, but here’s the interesting thing. If you present the options this way:

  1. 100 people die, with certainty.

  2. 90% chance no one dies; 10% chance 500 people die.

Then a majority choose option 2. Even though it’s the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
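
To check that the two descriptions really are the same gamble, assume the 500 people from the first description are the same people at risk here (which is what makes it the same gamble); then deaths are just 500 minus lives saved, and the expected values line up exactly. A small sketch:

```python
# The same gamble described twice, assuming the same 500 people are at risk.
at_risk = 500

# First framing: expected lives saved.
saved_1 = 1.0 * 400              # option 1: 400 saved with certainty
saved_2 = 0.9 * 500 + 0.1 * 0    # option 2: 90% chance all 500 are saved

# Second framing: expected deaths.
deaths_1 = 1.0 * 100             # option 1: 100 die with certainty
deaths_2 = 0.9 * 0 + 0.1 * 500   # option 2: 10% chance all 500 die

print(at_risk - saved_1, deaths_1)   # 100.0 100.0 -> identical outcome
print(at_risk - saved_2, deaths_2)   # 50.0 50.0   -> identical outcome
```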

You can grandstand on the second description too: “How can you condemn 100 people to certain death when there’s such a good chance you can save them? We’ll all share the risk! Even if it was only a 75% chance of saving everyone, it would still be worth it—so long as there’s a chance—everyone makes it, or no one does!”

You know what? This isn’t about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain’s feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn’t even a feather in the scales, when a life is at stake. Just shut up and multiply.

Previously on Overcoming Bias, I asked what was the least bad, bad thing that could happen, and suggested that it was getting a dust speck in your eye that irritated you for a fraction of a second, barely long enough to notice, before it got blinked away. And conversely, a very bad thing to happen, if not the worst thing, would be getting tortured for 50 years.

Now, would you rather that a googolplex people got dust specks in their eyes, or that one person was tortured for 50 years? I originally asked this question with a vastly larger number—an incomprehensible mathematical magnitude—but a googolplex works fine for this illustration.

Most people chose the dust specks over the torture. Many were proud of this choice, and indignant that anyone should choose otherwise: “How dare you condone torture!”

This matches research showing that there are “sacred values”, like human lives, and “unsacred values”, like money. When you try to trade off a sacred value against an unsacred value, subjects express great indignation (sometimes they want to punish the person who made the suggestion).

My favorite anecdote along these lines—though my books are packed at the moment, so no citation for now—comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn’t put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure.

Trading off a sacred value (like refraining from torture) against an unsacred value (like dust specks) feels really awful. To merely multiply utilities would be too cold-blooded—it would be following rationality off a cliff...

But let me ask you this. Suppose you had to choose between one person being tortured for 50 years, and a googol people being tortured for 49 years, 364 days, 23 hours, 59 minutes and 59 seconds. You would choose one person being tortured for 50 years, I do presume; otherwise I give up on you.

And similarly, if you had to choose between a googol people tortured for 49.9999999 years, and a googol-squared people being tortured for 49.9999998 years, you would pick the former.

A googolplex is ten to the googolth power. That’s a googol/100 factors of a googol. So we can keep doing this, gradually—very gradually—diminishing the degree of discomfort, and multiplying by a factor of a googol each time, until we choose between a googolplex people getting a dust speck in their eye, and a googolplex/googol people getting two dust specks in their eye.
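
If that exponent arithmetic goes by too fast, here is a small sketch of why the chain has room for so many steps, working with exponents of ten since a googolplex itself is far too large to write down:

```python
# A googol is 10**100; a googolplex is 10**(10**100).  Each step of the
# argument multiplies the number of sufferers by a googol, i.e. adds 100
# to the exponent of ten, while slightly reducing the discomfort.
googol_exponent = 100
googolplex_exponent = 10**100

# How many such steps fit before the headcount reaches a googolplex?
steps = googolplex_exponent // googol_exponent
print(steps)  # 10**98 -- that is, a googol/100 intermediate gradations
```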

If you find your preferences are circular here, that makes rather a mockery of moral grandstanding. If you drive from San Jose to San Francisco to Oakland to San Jose, over and over again, you may have fun driving, but you aren’t going anywhere. Maybe you think it a great display of virtue to choose for a googolplex people to get dust specks rather than one person being tortured. But if you would also trade a googolplex people getting one dust speck for a googolplex/googol people getting two dust specks et cetera, you sure aren’t helping anyone. Circular preferences may work for feeling noble, but not for feeding the hungry or healing the sick.

Altruism isn’t the warm fuzzy feeling you get from being altruistic. If you’re doing it for the spiritual benefit, that is nothing but selfishness. The primary thing is to help others, whatever the means. So shut up and multiply!

And if it seems to you that there is a fierceness to this maximization, like the bare sword of the law, or the burning of the sun—if it seems to you that at the center of this rationality there is a small cold flame -

Well, the other way might feel better inside you. But it wouldn’t work.

And I say also this to you: That if you set aside your regret for all the spiritual satisfaction you could be having—if you wholeheartedly pursue the Way, without thinking that you are being cheated—if you give yourself over to rationality without holding back, you will find that rationality gives to you in return.

But that part only works if you don’t go around saying to yourself, “It would feel better inside me if only I could be less rational.”

Chimpanzees feel, but they don’t multiply. Should you be sad that you have the opportunity to do better? You cannot attain your full potential if you regard your gift as a burden.

Added: If you’d still take the dust specks, see Unknown’s comment on the problem with qualitative versus quantitative distinctions.