I like the flipping attempt—changing framing around is a good way to triangulate intuitions. I think there are some things that could be strengthened about it:
1. The “alt” question is better than the first, but should include “and you won’t be reanimated otherwise”. In other words: God kills you iff (you accept the deal and) being signed up for cryonics before you get significant new information makes the difference between your being reanimated or not.
2. Several reasons jump to mind why I wouldn’t trade the rest of my years of life now for a 100% guarantee of n+1 years in the future: my family and friends will be sad now that I am not around, I will be sad later that they are not around, I know my quality of life now and I don’t know what it would be after reanimation, I am attempting effective altruism now and believe I would be far less effective in the future, etc.
So, maybe: “if being signed up for cryonics now makes the difference in getting you reanimated and at least as much total happiness in the future, God will replace you with an unconscious replica that will do exactly what you would have done”.
A different way to flip the question:
Let’s say you wanted to stay dead. Either being reanimated would pull you back down to earth, or you have some powerful enemies in the future or for whatever reason, it’s roughly as important to you that you stay dead as it currently is that you don’t. If you do nothing, you will be cryopreserved upon death. How much do you pay to avoid that?
Thanks for engaging!
I’m not sure I understand the “new information” part; maybe you just mean my current probability estimate. But yeah, I agree the probability should be on “I will be reanimated”, as a good preservation is useless if the world never reanimates you. However, I find it really strange to throw a perfectly good brain in the trash because you think the future will never reanimate you; I’m much more inclined to base the decision on the quality of brain preservation, and not to be so confident about the long-term future as to rule out reanimation capability and willingness.
My thought experiment isn’t about trading those, but maybe you’re saying the analogy is imperfect because living today is more valuable than living in the future, which is a coherent position, yeah.
I’m not following. Are you saying cryonics revival “is just a copy”? And so the analogy should preserve that?
If I understand correctly, you’re saying “what if being reanimated was bad instead of good, then how much would you pay to not be reanimated?”. The answer is probably “depends how bad” 🤷♂️ I don’t know if I’m missing something deeper you meant to ask though.
Sure, happy to clarify:
1. The “before new info” means that it would feel unfair if you took the deal and then God was like “well, I gotta kill you because in 6 months they’re going to have a breakthrough and do the first successful human reanimation”. You’d be like “well, then I would just have signed up in 6 months when I found out. So unless I would have died in the next 6 months, you shouldn’t kill me”. Alternatively the gamble could be that God kills you if you wouldn’t have ended up signing up for cryonics by the time you die and it would have worked.
2. Well yeah, I’m assuming that the point of your analogy is to construct it so that the hypothetical decision you make tells you what your actual decision should be on cryonics. If it’s just a whimsical thought experiment then there’s no need to match everything up. If it is intended to mean that someone who would require more than the current cost of cryonics to take the deal should sign up for cryonics, then it does have to match up stuff, because it is entirely coherent for someone to, for example, neither want to be revived into a dystopia nor be killed immediately.
The unconscious replica is intended to keep the impact on others the same: no guilt about traumatizing your children, for example, because they would still grow up with 2 loving parents. So you’re only weighing whether you want money or life, and not any moral duties you might have to avoid being killed prematurely.
Yeah, you understood my example. It’s not particularly deep. It’s just that I find many people have a pessimism bias, so I can feel myself thinking “cryonics probably won’t work”, but if I imagine someone evil wants to revive and hurt me, I think “but there’s a chance it would work...”. For the “depends how bad”, I think the 2 ways one can use the idea are a) stipulate that being revived is exactly as much worse than death as you expect reanimated life would be better than death, or b) just play with different severities and see if your gut estimate of the probability of revival changes.
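The symmetry in (a) can be made concrete with a toy expected-value calculation. All numbers below are invented for illustration, not claims about real probabilities or stakes: if the hostile revival is defined to be exactly as bad as a good revival is good, then a consistent gut probability of revival should make your two willingness-to-pay answers match, and any gap between them is the pessimism bias described above.

```python
# Toy expected-value sketch of the "flipped" cryonics question.
# All numbers are made-up placeholders for illustration only.

def willingness_to_pay(p_revival, value_of_outcome):
    """Expected value of the outcome that hinges on revival happening."""
    return p_revival * value_of_outcome

# Option (a): stipulate the hostile revival is exactly as bad (-$500k)
# as a good revival is good (+$500k).
p = 0.05  # hypothetical gut probability that revival ever happens

pay_for_cryonics = willingness_to_pay(p, 500_000)    # buying the upside
pay_to_avoid = -willingness_to_pay(p, -500_000)      # buying out the downside

# Under one consistent p, the two answers are equal (here, 25000.0 each).
# If your gut answers to the two framings imply different p's,
# that gap is the asymmetry the thought experiment is probing.
print(pay_for_cryonics, pay_to_avoid)
```

This is only a caricature (real preferences aren’t linear in money, and the disutility of a bad revival needn’t equal the utility of a good one), but it shows why playing with severities, as in (b), can surface an inconsistent probability estimate.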