So you’ll end up ‘finding yourself’ in one of the fantastically unlikely worlds where the explosive only maims you.
This is precisely what my example avoids. There are substantially more worlds where you got a 1 and there was no explosion than worlds where there was an explosion but you somehow managed to survive.
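To make that weight comparison concrete, here is a toy calculation. The specific numbers are my own assumptions, not part of the original setup: suppose the device fires unless a quantum die shows 1, and that surviving the explosion has some tiny probability (1e-9 here).

```python
# Toy branch-weight comparison for the quantum-die setup.
# Assumed numbers (mine, not the commenter's): the die shows 1 with
# probability 1/6, and the explosion is survivable with probability 1e-9.

p_roll_one = 1 / 6           # branch weight: rolled a 1, no explosion
p_explosion = 5 / 6          # branch weight: any other roll, device fires
p_survive_blast = 1e-9       # assumed chance of surviving the explosion

w_safe = p_roll_one                        # ordinary survivor branches
w_miracle = p_explosion * p_survive_blast  # 'miraculous survivor' branches

# Conditional on being a survivor at all, nearly all the weight sits
# in the no-explosion branches:
p_safe_given_alive = w_safe / (w_safe + w_miracle)

print(f"safe/miracle weight ratio: {w_safe / w_miracle:.2e}")  # ~2e8
print(f"P(no explosion | alive):   {p_safe_given_alive:.10f}")
```

Under these assumed numbers, the ordinary-survivor branches outweigh the miraculous ones by roughly eight orders of magnitude, which is the sense in which there are "substantially more" such worlds.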
Hmm. OK, you have a point there.
Still, the mere fact that your reasoning, if valid, implies (as explained above) that “you can never really die” constitutes a reductio.
Alternatively, if you want to say that your consciousness really can cease as long as it happens gradually, then how can there possibly be a principled boundary line between ‘sudden enough that you’ll survive’ and ‘not sudden enough’?
You spoke earlier of making sure that the method of death was faster than most thought processes, so as to avoid ‘committing yourself’ to a world where you die. But where’s the boundary between ‘committing yourself’ and not doing so? Can you “only partially” commit yourself? How would that work?
Doesn’t make sense.
Nope, it doesn’t. Unfortunately, we don’t need the many worlds hypothesis to run into this trouble. The trouble already exists in this single universe, assuming consciousness is computable. Just replace quantum world splitting with mind copying. Check out the Anthropic Trilemma.
If I make an exact copy of you, wait X minutes, and then instantly kill one of you, how big must X be before this is murder? Beats me. I suspect there is no hard line.
I would be willing to undergo such a procedure for 10 dollars if X is a minute or less (and you don’t kill me in front of me, no other adverse effects, etc.). If X is 10 minutes, probably about 100 dollars.
Interesting post!
Personally I think the third option is ‘obviously correct’. There isn’t really such a thing as a ‘thread of persisting subjective identity’. And this undermines the idea that in the quantum suicide scenario you should ‘expect to become’ the miraculous survivor.
All we can say is that the multiverse contains ‘miraculous observers’ with tiny ‘probability weights’ attached to them—and we can even concede that some of them get round to thinking “hang on—surely this means Many Worlds is true?” But whether their less unlikely counterparts live or die doesn’t affect this in any way.