[Question] Near-mode cryonics: A thought experiment
(x-post: LessDead)
Set-up: In a world that’s otherwise the same, an anthropomorphic God comes to you with a gun and a check, and offers you the following deal, should you accept it. (God can perfectly see the future and is honest.)
Dilemma: God will give you a check for X USD if you agree to be immediately and instantly killed iff the current cryonics protocol is sufficient to preserve identity.
Alternative: God kills you iff the following is true: If you were signed up for cryonics and died this year, you would get reanimated at some point in the future.
Question: What’s the minimum X for which you would accept this deal? What probability of dying are you trading this for?
Comparison:
Say 100,000 USD and 5%. This means this deal would give you:
100% earning 100k USD
5% of dying (right now, instead of ~0%)
If you die and get cryopreserved:
100% losing 100k USD (i.e., a typical cost for state-of-the-art cryonics)
5% of living (instead of 0%)
If you die and don’t get cryopreserved:
100% of saving 100k USD (by not paying for cryonics)
5% more chance of dying (100% instead of 95%)
In this sense, taking the deal is similar to not getting cryopreserved (and vice versa).
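The symmetry claimed here can be checked with a small expected-value sketch. The numbers below (100k USD payout, 5% chance the protocol preserves identity) come from the comparison above, while `value_of_survival` is a purely illustrative placeholder for the dollar value someone might place on surviving, not an estimate from the post:

```python
# Illustrative sketch, assuming the numbers from the comparison above.
# `value_of_survival` is a placeholder assumption, not a real estimate.

p_works = 0.05                   # assumed chance cryonics preserves identity
payout = 100_000                 # God's check, in USD
cost = 100_000                   # assumed cost of state-of-the-art cryonics
value_of_survival = 10_000_000   # placeholder dollar value on surviving

# Take God's deal: get the check for sure, die iff cryonics would have worked.
ev_deal = payout - p_works * value_of_survival

# Sign up for cryonics (evaluated at the moment of death): pay the cost,
# survive iff the protocol works.
ev_sign_up = -cost + p_works * value_of_survival

# Skip cryonics at death: save the cost, no chance of revival.
ev_skip = 0.0

print(f"take the deal:     {ev_deal:>12,.0f}")
print(f"sign up, then die: {ev_sign_up:>12,.0f}")
print(f"skip cryonics:     {ev_skip:>12,.0f}")
```

With the payout set equal to the cryonics cost, the deal and signing up come out as exact mirror images, which is the sense in which taking the deal resembles not getting cryopreserved.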
Disanalogies:
There are significant disanalogies, of course:
The type of life you would live
How long your life would be if you didn’t die
The social cost of signing up for cryonics
I still think this might be a useful thought experiment for bringing this dilemma into a more near-mode way of thinking. It also reverses the default option.
For anyone who thinks that cryonics is Pascal’s mugging, this thought experiment amounts to “if you don’t want to accept Pascal’s mugging, how about I construct an opposite Pascal’s mugging. I gotcha now—if you reject the first Pascal’s mugging, you have to accept this one!” Cryonics is a Pascal’s mugging with a small chance of a large benefit and a large chance of a smaller loss. This version is a small chance of a large harm and a large chance of a smaller benefit.
Pascal’s mugging is a bad deal either way. I won’t accept a very small X in this version merely because I think that the chance of success of cryonics is small, even if the calculation works out that way—the proper response to Pascal’s mugging is to not calculate.
This setup is very complicated and hard (for me) to internalize, meaning it doesn’t evoke new framings and doesn’t serve any purpose (again, for me) as a thought experiment. That it’s stated with multiple levels of if and iff makes it harder, but I don’t have any suggestions (except maybe a flowchart or decision matrix) for improving it.
It also doesn’t help that it mixes dollars and life, asking for a dollar value against the possibility (even if infinitesimal) of death. Nor that it supposes an omniscient god, which if I had sufficient evidence to believe in the first place SERIOUSLY changes my expectations and probabilities.
And my underlying probability IS infinitesimal that I will die in a way that current cryopreservation systems (tech, legal, and social) will actually lead to my future resurrection.
I like the flipping attempt—changing framing around is a good way to triangulate intuitions. I think there are some things that could be strengthened about it:
1. The “alt” question is better than the first, but should include “and you won’t be reanimated otherwise”, or in other words: God kills you iff (you accept the deal and) being signed up for cryonics before you get significant new information will make the difference between you being reanimated or not.
2. Several reasons jump to my mind why I wouldn’t trade the rest of my years of life now for a 100% guarantee of n+1 years in the future: My family and friends will be sad now that I am not around, I will be sad later that they are not around, I know my quality of life now and I don’t know what it would be after reanimation, I am attempting effective altruism now and believe I will be far less effective in the future, etc.
So, maybe: “if being signed up for cryonics now makes the difference in getting you reanimated and at least as much total happiness in the future, God will replace you with an unconscious replica that will do exactly what you would have done”
A different way to flip the question:
Let’s say you wanted to stay dead. Either being reanimated would pull you back down to earth, or you have some powerful enemies in the future or for whatever reason, it’s roughly as important to you that you stay dead as it currently is that you don’t. If you do nothing, you will be cryopreserved upon death. How much do you pay to avoid that?
Thanks for engaging!
I’m not sure I understand the “new information” part; maybe you just mean my current probability. But yeah, I agree the probability of “I will be reanimated” is the better target, as a good preservation is useless if the world will never reanimate you. However, I find it really strange to throw a perfectly good brain in the trash because you think the future will never reanimate you; I’m way more inclined to make the decision based on the quality of brain preservation, and not to be so confident about the long-term future as to rule out reanimation capability and willingness.
My thought experiment isn’t about trading those, but maybe you’re saying the analogy is imperfect because living today is more valuable than living in the future, which is a coherent position, yeah.
I’m not following. Are you saying cryonics revival “is just a copy”? And so the analogy should preserve that?
If I understand correctly, you’re asking “what if being reanimated was bad instead of good, then how much would you pay to not be reanimated?” The answer is probably “depends how bad” 🤷‍♂️ I don’t know if I’m missing something deeper you meant to ask, though.
Sure, happy to clarify:
1. The “before new info” means that it would feel unfair if you took the deal and then God was like “well, I gotta kill you because in 6 months they’re going to have a breakthrough and do the first successful human reanimation”. You’d be like “well, then I would just have signed up in 6 months when I found out. So unless I would have died in the next 6 months, you shouldn’t kill me”. Alternatively the gamble could be that God kills you if you wouldn’t have ended up signing up for cryonics by the time you die and it would have worked.
2. Well yeah, I’m assuming that the point of your analogy is to construct it so that the hypothetical decision you make tells you what your actual decision should be on cryonics. If it’s just a whimsical thought experiment then there’s no need to match everything up. If it is intended to mean that someone who would require more than the current cost of cryonics to take the deal should sign up for cryonics, then it does have to match up stuff, because it is entirely coherent for someone to, for example, neither want to be revived into a dystopia nor be killed immediately.
The unconscious replica is intended to keep the impact on others the same. No guilt about traumatizing your children, for example, because they would still grow up with two loving parents. So you’re just weighing whether you want money or life, and not any moral duties you might have to avoid being killed prematurely.
Yeah, you understood my example. It’s not particularly deep. It’s just that I find that many people have a pessimism bias, so I can feel myself thinking “cryonics probably won’t work” but if I imagine someone evil wants to revive and hurt me I think “but there’s a chance it would work...”. For the “depends how bad”, I think the 2 ways one can use the idea are a) set that it’s exactly as much worse than death as you expect reanimated life would be better than death, or b) just play with different severities and see if your gut estimate of the probability of revival changes.
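Option (b) above, sweeping severities and watching whether the gut probability moves, can be made concrete with a quick sketch. The `gut_probability` function below is a hypothetical stand-in for a person's elicited answers (both its shape and numbers are invented for illustration), not a model of anyone's actual beliefs:

```python
# Sketch of option (b): vary how bad the anti-revival scenario is and
# record the revival probability one's gut reports at each severity.
# `gut_probability` is a hypothetical stand-in for a real elicitation.

def gut_probability(badness: float) -> float:
    """Placeholder for an elicited P(revival works); in a pessimism-biased
    subject this might creep upward as the stakes get scarier."""
    base = 0.02
    return min(0.25, base + 0.03 * badness)

# Severity of the bad outcome, relative to the expected benefit of revival.
severities = [0.5, 1.0, 2.0, 4.0]
estimates = {s: gut_probability(s) for s in severities}

for s, p in estimates.items():
    print(f"badness x{s}: elicited P(revival) = {p:.3f}")

# The tell: P(revival works) is a fact about the world, so it shouldn't
# depend on whether revival would be good or bad for you.
inconsistent = len(set(estimates.values())) > 1
print("framing-sensitive:", inconsistent)
```

If the elicited probability shifts with severity, that inconsistency is exactly the pessimism bias the flipped framing is meant to surface.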