However, this phrase puzzles me:
identical copy immortality seems to imply that we shouldn’t care much about dying, as long as some copies of us live on in those other “places”.
There is a difference between dying as we know it and a reduction in copy quantity.
Some kinds of death (any intentional suicide, for example) would have to apply to all Independent Identical Copies as well, because they (we?) would all do it at the same time, thanks to the shared algorithm.
Other kinds of death are due to inputs, be they diseases or bullets, wrecking the physical brainware.
The copy-reduction that is also termed death is not something we are capable of noticing: 99 of 100 copies vanishing is no more perceptible than going from 100 copies to 200.
The death we care about has little to do with copy-reduction, except insofar as it terminates an individual instance.
So it would seem that doubling the number of copies is not a hedonistic plus, because the copied mind is incapable of noticing the fact.
Back to the original pair of outcomes: A is clearly preferable, because from the perspective of that one copy there is no effect. When the organisation gets more money, more copies can be instantiated, again with no effect from the copied mind's point of view.
B is effectively a form of Russian Roulette, and not a good idea.
Like other commenters, I finish by wondering why the civilization running the simulation is indifferent to the choice.
Certainly, A over B.