I agree, intuition is very difficult here. In this specific scenario, I’d lean towards saying yes—it’s the same person with a physically different body and brain, so I’d like to think that there is some continuity of the “person” in that situation. My brain isn’t made of the “same atoms” it was when I was born, after all. So I’d say yes. In fact, in practice, I would definitely assume said robot and software to have moral value, even if I wasn’t 100% sure.
However, if the original brain and body weren’t destroyed, and we now had two apparently identical individuals claiming to be people worthy of moral respect, then I’d be more dubious. I’d be extremely dubious of creating twenty robots running identical software (which seems entirely possible with the technology we’re supposing) and assigning them the moral status of twenty people. “People”, of the sort deserving of rights and dignity and so forth, shouldn’t be the sort of thing that can be arbitrarily created through a mechanical process. (And yes, human reproduction and growth is a mechanical process, so there’s a problem there too.)
Actually, come to think of it… if you have two copies of software (either electronic or neuron-based) running on two separate machines, but it’s the same software, could they be considered the same person? After all, they’ll make all the same decisions given similar stimuli, and thus are using the same decision process.
Yes, the consensus seems to be that running two copies of yourself in parallel doesn’t give you more measure or moral weight. But if the copies receive different inputs, they’ll eventually (frantic handwaving) diverge into two different people who both matter. (Maybe the threshold is when we can no longer recover Copy-A’s current state from Copy-B’s current state plus their respective inputs, because information about the shared initial state has been destroyed?)
Have you read the quantum physics sequence? Would you agree with me that nothing you learn about seemingly unrelated topics like QM should have the power to destroy the whole basis of your morality?