One way to avoid the absurd conclusion is to say that it doesn’t matter if another mind is you.
Suppose I have a utility function over the entire quantum wave function, one that is mostly focused on beings similar to myself. Then I consider the alternate me, who differs only in phone number, getting £100 to be about as good as the original me getting £100. As far as my utility function goes, both versions of me would simply be made worse off by forgetting the number.
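This kind of preference can be sketched as a similarity-weighted utility function. The sketch below is purely illustrative: the trait-set representation of a mind, the Jaccard similarity measure, and all names are my own assumptions, not anything specified in the text.

```python
# Illustrative sketch (not from the original text): a utility function that
# weights each being's welfare by its similarity to "me".

def similarity(mind_a, mind_b):
    """Jaccard similarity between two minds' trait sets (a toy proxy)."""
    traits_a, traits_b = set(mind_a), set(mind_b)
    return len(traits_a & traits_b) / len(traits_a | traits_b)

def weighted_utility(my_mind, outcomes):
    """Sum each being's welfare, weighted by its similarity to my_mind.

    outcomes: list of (mind, welfare) pairs.
    Setting every weight to 1 instead would recover the open-individualist,
    total-welfare view discussed below.
    """
    return sum(similarity(my_mind, mind) * welfare
               for mind, welfare in outcomes)

me = ["memories", "values", "phone:555-0100"]
alt_me = ["memories", "values", "phone:555-0199"]  # differs only in phone number
stranger = ["other-memories", "other-values", "phone:555-0123"]

print(weighted_utility(me, [(alt_me, 100)]))    # → 50.0
print(weighted_utility(me, [(stranger, 100)]))  # → 0.0
```

Note that any such weighting draws a graded circle around "me": the near-copy's £100 counts for much more than the stranger's, which is exactly the structure the next paragraph examines.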
Most human preferences have an embedded notion of identity as the receiver of the benefit. However, the idea of “beings similar to me” presupposes that there are beings which are not similar enough to me to be regarded as me, but which still share some of my traits. In other words, any definition of identity creates the possibility of “pseudo-copies”: if we define identity more widely, the circle of pseudo-copies around it also becomes wider, and it does not disappear until we include all possible beings and end up with open individualism.
If we assume total “open individualism”, the result is perfect effective altruism, and the utility function becomes something like “I prefer that the total wellbeing of all sentient beings in the universe increase by £100”. However, this is not how most human preferences work, and an agent with such preferences also risks starvation, since it has no special concern for its own body.
So playing with the definition of identity does not help us escape the problem: pseudo-copies exist, and one of them could become the “real me” if some information is erased from both of us.