If you somehow managed to create an identical copy of me, that would imply I should be indifferent to whether I or my copy lives. But if that happened, I suspect that both of my instances would prefer to be the one to live.
I suspect that a short, private conversation with your copy would change your mind. The other thing here is that 1 is far from the ideal number of copies of you—you’d probably be extremely happy to live with other copies of yourself up to a few thousand or something. So going from 2 to 1 is a huge loss to your copy-clan’s utility.
I suspect that a short, private conversation with your copy would change your mind
Can you elaborate how?
E.g. suppose I were to be copied, and then one of us would be chosen by lot to be taken in front of a firing squad while the other could continue his life freely. I expect—though of course it’s hard to fully imagine this kind of hypothetical—that the thought of being taken in front of that firing squad and never seeing any of my loved ones again would create a rather visceral sense of terror in me. Especially if I was given a couple of days for the thought to sink in, rather than just being in a sudden shock of “wtf is happening”.
It’s possible that the thought of an identical copy of me being out there in the world would bring some comfort to that, but mostly I don’t see how any conversation would have a chance of significantly nudging those reactions. They seem much too primal and low-level for that.
Here is some FB discussion about the pragmatics of a situation where you could have many clones of yourself. The specific hypothetical setup has the advantage of making people think about what they specifically would do:
https://www.facebook.com/duncan.sabien/posts/pfbid0Ps7QFKFvCHWSV1MUNWLhegXN9MN8Le4MX6k3PHmhDHLfFsSFNc194TYvH1vWCXzbl