You Only Live Twice

“It just so happens that your friend here is only mostly dead. There’s a big difference between mostly dead and all dead.”
-- The Princess Bride

My co-blogger Robin and I may disagree on how fast an AI can improve itself, but we agree on an issue that seems much simpler to us than that: At the point where the current legal and medical system gives up on a patient, they aren’t really dead.

Robin has already said much of what needs saying, but a few more points:

Ben Best’s Cryonics FAQ, Alcor’s FAQ, Alcor FAQ for scientists, Scientists’ Open Letter on Cryonics

• I know more people who are planning to sign up for cryonics Real Soon Now than people who have actually signed up. I expect that more people have died while cryocrastinating than have actually been cryopreserved. If you’ve already decided this is a good idea, but you “haven’t gotten around to it”, sign up for cryonics NOW. I mean RIGHT NOW. Go to the website of Alcor or the Cryonics Institute and follow the instructions.

• Cryonics is usually funded through life insurance. The following conversation from an Overcoming Bias meetup is worth quoting:

Him: I’ve been thinking about signing up for cryonics when I’ve got enough money.

Me: Um… it doesn’t take all that much money.

Him: It doesn’t?

Me: Alcor is the high-priced high-quality organization, which is something like $500-$1000 in annual fees for the organization, I’m not sure how much. I’m young, so I’m signed up with the Cryonics Institute, which is $120/year for the membership. I pay $180/year for more insurance than I need—it’d be enough for Alcor too.

Him: That’s ridiculous.

Me: Yes.

Him: No, really, that’s ridiculous. If that’s true then my decision isn’t just determined, it’s overdetermined.

Me: Yes. And there’s around a thousand people worldwide [actually 1400] who are signed up for cryonics. Figure that at most a quarter of those did it for systematically rational reasons. That’s a high upper bound on the number of people on Earth who can reliably reach the right conclusion on massively overdetermined issues.

• Cryonics is not marketed well—or at all, really. There’s no salespeople who get commissions. There is no one to hold your hand through signing up, so you’re going to have to get the papers signed and notarized yourself. The closest thing out there might be Rudi Hoffman, who sells life insurance with cryonics-friendly insurance providers (I went through him).

• If you want to securely erase a hard drive, it’s not as easy as writing it over with zeroes. Sure, an “erased” hard drive like this won’t boot up your computer if you just plug it in again. But if the drive falls into the hands of a specialist with a scanning tunneling microscope, they can tell the difference between “this was a 0, overwritten by a 0” and “this was a 1, overwritten by a 0”.

There are programs advertised to “securely erase” hard drives using many overwrites of 0s, 1s, and random data. But if you want to keep the secret on your hard drive secure against all possible future technologies that might ever be developed, then cover it with thermite and set it on fire. It’s the only way to be sure.

Pumping someone full of cryoprotectant and gradually lowering their temperature until they can be stored in liquid nitrogen is not a secure way to erase a person.

See also the information-theoretic criterion of death.
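The “0s, 1s, and random data” scheme those utilities advertise fits in a few lines; here is a minimal sketch (the function name and the particular pass sequence are mine, not any specific utility’s):

```python
import os
import tempfile

def multi_pass_overwrite(path, passes=(b"\x00", b"\xff", None)):
    """Overwrite a file in place with several patterns.

    Each entry in `passes` is a one-byte pattern, or None for random
    data; the default is the all-zero-bits, all-one-bits, random
    sequence the essay mentions. As the essay notes, no number of
    passes is a guarantee against every future recovery technology.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in passes:
            f.seek(0)
            data = os.urandom(size) if pattern is None else pattern * size
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # push each pass to the device, not just the page cache

# Usage: scribble over a throwaway temp file.
fd, tmp = tempfile.mkstemp()
os.write(fd, b"a secret")
os.close(fd)
multi_pass_overwrite(tmp)
final = open(tmp, "rb").read()  # now random bytes from the last pass
os.remove(tmp)
```

(On real hardware, SSD wear-leveling and copy-on-write filesystems can leave old copies of the blocks untouched, which is part of why the thermite remark is only half a joke.)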

• You don’t have to buy what’s usually called the “patternist” philosophy of identity, to sign up for cryonics. After reading all the information off the brain, you could put the “same atoms” back into their old places.

• “Same atoms” is in scare quotes because our current physics prohibits particles from possessing individual identities. It’s a much stronger statement than “we can’t tell the particles apart with current measurements” and has to do with the notion of configuration spaces in quantum mechanics. This is a standard idea in QM, not an unusual woo-woo one—see this sequence on Overcoming Bias for a gentle introduction. Although patternism is not necessary to the cryonics thesis, we happen to live in a universe where “the same atoms” is physical nonsense.

There’s a number of intuitions we have in our brains for processing a world of distinct physical objects, built in from a very young age. These intuitions, which may say things like “If an object disappears, and then comes back, it isn’t the same object”, are tuned to our macroscopic world and generally don’t match up well with fundamental physics. Your identity is not like a little billiard ball that follows you around—there aren’t actually any billiard balls down there.

Separately and convergently, more abstract reasoning strongly suggests that “identity” should not be epiphenomenal; that is, you should not be able to change someone’s identity without changing any observable fact about them.

If you go through the aforementioned Overcoming Bias sequence, you should actually be able to see intuitively that successful cryonics preserves anything about you that is preserved by going to sleep at night and waking up the next morning.

• Cryonics, to me, makes two statements.

The first statement is about systematically valuing human life. It’s bad when a pretty young white girl goes missing somewhere in America. But when 800,000 Africans get murdered in Rwanda, that gets 1/134 the media coverage of the Michael Jackson trial. It’s sad, to be sure, but no cause for emotional alarm. When brown people die, that’s all part of the plan—as a smiling man once said.

Cryonicists are people who’ve decided that their deaths, and the deaths of their friends and family and the rest of the human species, are not part of the plan.

I’ve met one or two Randian-type “selfish” cryonicists, but they aren’t a majority. Most people who sign up for cryonics wish that everyone would sign up for cryonics.

The second statement is that you have at least a little hope in the future. Not faith, not blind hope, not irrational hope—just, any hope at all.

I was once at a table with Ralph Merkle, talking about how to market cryonics if anyone ever gets around to marketing it, and Ralph suggested a group of people in a restaurant, having a party; and the camera pulls back, and moves outside the window, and the restaurant is on the Moon. Tagline: “Wouldn’t you want to be there?”

If you look back at, say, the Middle Ages, things were worse then. I’d rather live here than there. I have hope that humanity will move forward further, and that’s something that I want to see.

And I hope that the idea that people are disposable, and that their deaths are part of the plan, is something that fades out of the Future.

Once upon a time, infant deaths were part of the plan, and now they’re not. Once upon a time, slavery was part of the plan, and now it’s not. Once upon a time, dying at thirty was part of the plan, and now it’s not. That’s a psychological shift, not just an increase in living standards. Our era doesn’t value human life with perfect consistency—but the value of human life is higher than it once was.

We have a concept of what a medieval peasant should have had, the dignity with which they should have been treated, that is higher than what they would have thought to ask for themselves.

If no one in the future cares enough to save people who can be saved… well. In cryonics there is an element of taking responsibility for the Future. You may be around to reap what your era has sown. It is not just my hope that the Future be a better place; it is my responsibility. If I thought that we were on track to a Future where no one cares about human life, and lives that could easily be saved are just thrown away—then I would try to change that. Not everything worth doing is easy.

Not signing up for cryonics—what does that say? That you’ve lost hope in the future. That you’ve lost your will to live. That you’ve stopped believing that human life, and your own life, is something of value.

This can be a painful world we live in, and the media is always telling us how much worse it will get. If you spend enough time not looking forward to the next day, it damages you, after a while. You lose your ability to hope. Try telling someone already grown old to sign up for cryonics, and they’ll tell you that they don’t want to be old forever—that they’re tired. If you try to explain to someone already grown old, that the nanotechnology to revive a cryonics patient is sufficiently advanced that reversing aging is almost trivial by comparison… then it’s not something they can imagine on an emotional level, no matter what they believe or don’t believe about future technology. They can’t imagine not being tired. I think that’s true of a lot of people in this world. If you’ve been hurt enough, you can no longer imagine healing.

But things really were a lot worse in the Middle Ages. And they really are a lot better now. Maybe humanity isn’t doomed. The Future could be something that’s worth seeing, worth living in. And it may have a concept of sentient dignity that values your life more than you dare to value yourself.

On behalf of the Future, then—please ask for a little more for yourself. More than death. It really… isn’t being selfish. I want you to live. I think that the Future will want you to live. That if you let yourself die, people who aren’t even born yet will be sad for the irreplaceable thing that was lost.

So please, live.

My brother didn’t. My grandparents won’t. But everything we can hold back from the Reaper, even a single life, is precious.

If other people want you to live, then it’s not just you doing something selfish and unforgivable, right?

So I’m saying it to you.

I want you to live.