The honest answer to this question is that it is possible you’ll be revived into a world that is not worth living in, in which case you could opt for suicide.
And then there’s a chance that you get revived into a world where you are in some terrible situation but not allowed to kill yourself. In that case, you have ended up worse off than if you had simply died.
That’s a risk for regular death, too, albeit a very unlikely one. This possibility seems like Pascal’s wager with a minus sign.
That said, I am nowhere near certain that a bad future awaits us, nor that the above-mentioned Malthusian scenario is inevitable. However, it does seem to me the most plausible course of affairs given a cheap technology for making and copying minds, and it seems reasonable to expect that such technology would follow from more or less the same breakthroughs that would be necessary to revive people from cryonics.
I think that we wouldn’t actually end up in a Malthusian regime—we’d coordinate so that didn’t happen. Especially compelling is the fact that in these regimes of high copy fidelity, you could end up with upload “clans” that acted as one decision-theoretic entity, and would quickly gobble up lone uploads through the power their cooperation gave them.