Costs to (potentially) eternal life

Imagine Omega came to you and said, “Cryonics will work; it will be possible for you to be resurrected and have the choice between a simulation and a new healthy body, and I can guarantee you live for at least 100,000 years after that. However, for reasons I won’t divulge, your surviving to experience this is wholly contingent upon you killing the next three people you see. I can also tell you that the next three people you see, should you fail to kill them, will die childless and will never sign up for cryonics. There is a knife on the ground behind you.”

You turn around and see someone. She says, “Wait! You shouldn’t kill me because … “

What does she say that convinces you?

[Cryonics] takes the mostly correct idea “life is good, death is bad” to such an extreme that it does violence to other valuable parts of our humanity (sorry, but I can’t be more specific).

That’s a quote from a comment on a post about cryonics. “I can’t be more specific” is not doing that comment any favors, and overall it was rebutted pretty well. But I did try to imagine these other valuable parts, and I realized something that remains unresolved for me.

Guaranteed death places a limit on the value of my life to myself. Because that limit exists, parents shield children with their bodies; Casey Jones happens more often; people run into burning buildings more often. (Suicide bombers happen more often, too, I realize.)

I think this is a valuable part of humanity, and I think that an extreme “life is good, death is bad” view does do violence to it. You can argue we should effect a world that makes this willingness unnecessary, and I’ll support that; but separate from making the willingness useless, eliminating that willingness does violence to our humanity. You can argue that our humanity is overrated and there’s something better over the horizon, i.e., that the cost is worth it.

But the incentive to save 1+X lives at the cost of your own just got weaker. How do you put a price on heaven? orthonormal suggests that we should rely on human irrationality here to keep us moral: that thankfully we are too stupid and slow to actually change the decisions we make after recognizing that the expected value of our options has changed, even though the opportunity cost of those decisions has grown considerably. I think this a) underestimates humans’ ability to react to incentives and b) underestimates the reward the universe bestows on those who do react to incentives.

I don’t see a good “solution” to this problem, other than to rely on cognitive dissonance to make this seem less offensive in the future than it does now. The people for whom this presents a problem will eventually die out anyway, since there is a clear advantage to favoring the other side. I guess that’s the (ultimately anticlimactic) takeaway: morals change in the face of progress.

So, which do you favor more: your life, or your identity?

EDIT: Well, it looks like this is getting fast-tracked for disappeared status. I think it’s interesting that people seem to think I’m making a statement about a moral code. I’m not; I’m talking about incentives and what would happen, not what the right thing to do is.

Let’s say Eliezer gets his wish and many, many parents sign up for cryonics and sign their children up for cryonics. Does anyone really expect that this population would not respond to its new incentives by avoiding more danger? Anecdotes aside, do you expect them to join the military with the same frequency, to be firemen with the same frequency, to be doctors administering vaccinations in jungles with the same frequency? I don’t think it’s possible to say that with a straight face and mean it; populations respond to incentives, and the incentives just changed for that population.