Cryonics Questions

Cryonics fills many with disgust, a cognitively dangerous emotion. To test whether a few of your possible cryonics objections are reason-based or disgust-based, I list six non-cryonics questions. Answering yes to any one question indicates that, rationally, you shouldn’t hold the corresponding cryonics objection.

1. You have a disease and will soon die unless you get an operation. With the operation you have a non-trivial but far from certain chance of living a long, healthy life. By some crazy coincidence the operation costs exactly as much as cryonics does, and the only hospitals capable of performing the operation are next to cryonics facilities. Do you get the operation?

Answering yes to (1) means you shouldn’t object to cryonics because of costs or logistics.

2. You have the same disease as in (1), but now the operation costs far more than you could ever afford. Fortunately, you have exactly the right qualifications NASA is looking for in a spaceship commander. NASA will pay for the operation if, in return, you captain the ship should you survive. The ship will travel close to the speed of light. The trip will subjectively take you a year, but when you return one hundred years will have passed on Earth. Do you get the operation?

Answering yes to (2) means you shouldn’t object to cryonics because of the possibility of waking up in the far future.
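(For the curious, the scenario’s numbers check out under special relativity; the arithmetic below is my own quick verification of the stated one-year/hundred-year figures. A hundred Earth years against one subjective year implies a Lorentz factor of 100, which fixes the ship’s speed:

  γ = t_Earth / t_ship = 100 yr / 1 yr = 100
  γ = 1 / √(1 − v²/c²)  ⇒  v = c√(1 − 1/γ²) ≈ 0.99995c

So the ship must travel at about 99.995% of the speed of light.)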

3. Were you alive 20 years ago?

Answering yes to (3) means you have a relatively loose definition of what constitutes “you,” and so you shouldn’t object to cryonics because you fear that the thing that would be revived wouldn’t be you.

4. Do you believe that there is a reasonable chance that a friendly singularity will occur this century?

Answering yes to (4) means you should think it possible that someone cryogenically preserved would be revived this century. A friendly singularity would likely produce an AI that in one second could think all the thoughts that would take a billion scientists a billion years to contemplate. Given that bacteria seem to have mastered nanotechnology, it’s hard to imagine that a billion scientists working for a billion years wouldn’t have a reasonable chance of mastering it. Also, a friendly post-singularity AI would likely have enough respect for human life that it would be willing to revive cryonics patients.

5. You somehow know that a singularity-causing intelligence explosion will occur tomorrow. You also know that the building you are currently in is on fire. You pull an alarm and observe everyone else safely leaving the building. You realize that if you don’t leave you will fall unconscious, painlessly die, and have your brain incinerated. Do you leave the building?

Answering yes to (5) means you probably shouldn’t abstain from cryonics because you fear being revived and then tortured.

6. One minute from now a man pushes you to the ground, pulls out a long sword, presses the sword’s tip to your throat, and vows to kill you. You have one small chance at survival: grab the sword’s sharp blade, thrust it away, and run. But even with your best efforts you will still probably die. Do you fight against death?

Answering yes to (6) means you can’t pretend that you don’t value your life enough to sign up for cryonics.

If you answered yes to all six questions but have not signed up for cryonics and do not intend to, please give your reasons in the comments. What other questions can you think of that provide a non-cryonics way of getting at cryonics objections?