This looks like an argument for extreme time preference, not an argument against copies. Why identify with one million-years-later version of yourself and exclude the other, unless we beg the question?
That’s what I’m saying. I myself wouldn’t identify with any of the copies, no matter how near or distant. My clone and I have a lot in common, but we are separate sentient beings (hence: requesting suicide of the other is tantamount to murder). But if you do identify with clones (as in: they are you, not merely other beings similar to you), then at some point you and they must cross a line of divergence beyond which they are no longer you, or else the argument reduces to absurdity. Where is that line? I see no non-arbitrary way of defining it.
EDIT: which led me to suspect that, other than intuition, I have no reason to think my clone and I share the same identity, which in turn led me to consider other models of consciousness and identity. My terseness isn’t just because of the moral repugnance of asking others to commit suicide, but also because this is an old, already-hashed-out argument. I first encountered it in a philosophy class 10+ years ago. If there is a formal response to the reductio I gave (one that doesn’t also throw out consciousness entirely), I have yet to see it.
Maybe you already got this part, but time preference is orthogonal to copies vs originals.
Eliezer says he defines personal identity in part by causal connections, which exist between you and the “clone” as well as between you and your “original” in the future. This definition also suggests a hole in your argument for strong time preference.
You are misreading me. I don’t have time preference. If an exact perfect replica of me were made, it would not be me even at the moment of duplication.
I have continuation-of-computation preference. This is much stricter than Eliezer’s causal-connection-based identity, but it also avoids many of the weird predictions that arise from that view.
And yes, you would need a bright line in this case. Fuzziness is in the map, not the territory on this item.
We certainly don’t need a bright line.