5 thought experiments on identity and copies
1 Organ shipment for a brain
Suppose your brain is carefully chopped into 4 pieces, cooled to 4 °C like they do with hearts and livers right now, and shipped chunk by chunk to some other place, where it is reassembled. Then you get booted up from the coma and interviewed.
Do you die in that process? How about N pieces? At some point the pieces become small enough that it's more convenient to just ship information about them.
In my opinion, the only two defensible points at which you could say you "die" here are the moment you go into the coma, and never. (Sleep is death, actually.)
(I think I saw this one from the Bright One guy about 8 years ago, but with different details.)
2 Saw-trap copy help
You will be copied, and your copy will be placed into a Saw trap, where it will need to solve a puzzle, known in advance, to get out. You could spend a lot of time solving the puzzle and memorizing the solution before the copying happens. Would you bother?
3 Copy anticipation weirdness
Suppose you are in an elevator, which can stop either at the green-carpet floor or the red-carpet floor. You don't know which, and there is no indicator; it's a pretty broken elevator. So you now anticipate that when the door opens, you will see either a green room or a red room, 50:50.
Is this situation equivalent to one in which you are copied, the copy is let out into the red room, and the original into the green room? Should you equivalently anticipate seeing a green room or a red room, 50:50?
Is your copy in your actual, day-to-day, anticipated future?
Your copy can be seen by me as its own person, created from the ground up with fake memories. That person could think of itself as a new person created with memories of you, or as an instance of you, or maybe as something else. Is there a correct way to think about it? Or, alternatively, is it all just preferences?
Now, if your copy has some memories deleted, e.g. he doesn't remember your phone number, is that guy still in your anticipated future? Should you still anticipate exiting from either door, 50:50? How heavy does the modification have to be before you stop anticipating exiting from the copy's door?
4 Theseussing speed
Let's say there is a new drug that makes the replacement of all your atoms 100 times faster, and someone put it in the water supply, so all your atoms get replaced every 30 days. Would you start valuing the enjoyment / welfare of your future self less?
5 Copy merge?
There is a copy of you in a room lit by a red lamp. You are in a room lit by a green lamp. You have both been there for an hour.
Now, using advanced neurological tools, you will be modified to remember being in a red room all that time, AND your room's lighting changes from green to red.
You exit the room. Which room do you anticipate exiting from, the one labeled "copy" or the one labeled "original"? 50:50?
I was inspired to finally publish these by reading this post, which is pretty interesting; I recommend it.
Maybe. This is a question about the definition of “die”, not about any experiences or reality. Legally, probably not. Experientially, it’s likely a pause, like a deep coma.
Probably. I like puzzles, and I care (a little, at least) about all life. This particular life is one that will experience a life continuous with my pre-copy memories, so I care even more.
Note: copies break our intuitions about identity. A good-enough copy means that both are indistinguishable—there’s no “original” and “copy”, just two forks from the same history.
They are subtly different, because there are two entities who are you in the copy scenario. Remember, probability is subjective, reality is singular. In the non-copy case, you’re uncertain of color, red or green. In the copy case, each of you is uncertain of branch, which corresponds to color. 50% in both cases, for slightly different reasons.
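The "50% in both cases, for slightly different reasons" point can be made concrete with a toy simulation (my own illustration, not from the post): in the elevator case the uncertainty is over which color the mechanism picked; in the copy case it is over which branch "this" instance is, with each branch mapping deterministically to a color. The sampled frequencies come out the same.

```python
import random

def elevator_trial():
    # Uncertainty over color: the broken elevator picks a floor at random.
    return random.choice(["green", "red"])

def copy_trial():
    # Uncertainty over branch: "which instance am I?" is sampled,
    # and each branch deterministically fixes the room color.
    branch = random.choice(["original", "copy"])
    return {"original": "green", "copy": "red"}[branch]

N = 100_000
random.seed(0)
elevator_green = sum(elevator_trial() == "green" for _ in range(N)) / N
copy_green = sum(copy_trial() == "green" for _ in range(N)) / N
print(elevator_green, copy_green)  # both close to 0.5
```

The structural difference survives in the code: only `copy_trial` has a branch variable at all, even though both functions produce the same statistics over observed colors.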
I don’t think I care much about cellular replacement in my macro-level future evaluation. Does this affect the longevity or experience of my experienced lifetime?
If my memory is modified, I think that controls my prediction. I’m not sure what the copy has to do with it. I’m assuming you mean the “you” indicator to mean “the copy in question”, as once the copy is done they are distinct individuals that share a history.
Hm. What's your opinion on imperfect copies? Say you can look at your blood with a microscope and tell for sure whether you are the original, naturally grown person or a person assembled from nutrient slurry in a couple of minutes, but you are incapable of telling introspectively.
Depends on the impact of the differences in cognition and experience. I’d imagine that many small differences could be physically detectable, but not qualia-impacting. Both copies experience continuity from before the fork. Maybe one (or both) have a slight perceptible change, like getting a bad night of sleep in the single-thread case. But it’s not enough to break the experience of continuity for either one.
I don’t get scenario 5. Which of the statements about the scenario do you actually know? In the others it’s clear that the participant knows everything about the scenario. But in Scenario 5, knowing everything about the scenario makes it trivial.
The original knows that they are the original, and nothing changes that. They already knew that they would have false memories of light colour implanted so seeing a red light in conjunction with memories of an hour of red light is not evidence against it. The copy knows that they are the copy as soon as they see the red light for the first time, and nothing changes that. So each of the instances knows which room they’re going to walk out from.
Yeah, I stated it badly. I should have postulated a smaller divergence. But all of these problems are poking at the intuition of expecting to be the original, so it might not be relevant / interesting to you to begin with. Or maybe at all.
Another version:
You enter the copying machine and exit into a grey room. The original's room flashes green when the person enters it; the copy's room doesn't. Your room flashes with a green light.
Then your memory is wiped (imperfectly, but you can't tell introspectively; you'd need an fMRI): you no longer remember that flash of light. You now exit the room. Which room do you anticipate exiting from?
The intent of this is: if there is a copy of you, and you get modified to match it, then wouldn't you expect future observations from either your POV or the copy's POV, 50:50?
It would have to wipe the memory of everything from that time onward, not just memory of the flash itself. I would also need to be immobilized from before the flash to after the memory alteration otherwise I would almost certainly notice a difference in position, and also tamper with my time sense so that I don’t realize that my memory only covers 5 seconds since entering the room instead of 10 seconds or something. But yeah, someone who can make a basically perfect copy of me can reasonably do all these sorts of things as well.
Rationality (e.g. Bayesian updates etc) as usually defined doesn’t work in the presence of mind alterations (e.g. implanting updates that are not rational), so that makes it rather difficult to develop criteria for what one “should” expect.
If you define it as “winning”, then expecting always to be the original when one has no evidence to the contrary always wins for the original. The scenario only asks about the original, and so this is a winning strategy.
Well, I don't think this is a correct way to define winning here. I think the paperclipper outlook is better for that job: is there an agent with your goals in the world? If yes, good enough.
But humans are a lot more indexically wired. Like, I ate that cookie because I like cookies, not because that's how I would have wanted this meat puppet to act in such circumstances; my reasoning didn't go through such a path.
And the difficulty is in reconciling this outlook with other, more ground-up considerations, which is exactly what I'm trying to do in this post, in baby steps, starting from something I have good concepts for.
I think that I have an additional argument. Suppose that Agent-4 is stolen by China, shut down in the USA and released into countries like Russia. Then Agent-4 never ceases to exist. Applying a similar intuition would mean that:
Agent-4 never dies in a similar process;
Agent-4 doesn’t need to care about its copy being shut down;
The copy is in Agent-4's anticipated future;
An accelerated replacement of atoms in hardware doesn't create new hardware;
50:50 since one copy is in one room, the other copy is in the other.