Yes, thinking about it more, the only policy I found that didn't lead to problems was (in the case where the cloning happens after the act):
All instances of the person should be regarded as just as culpable as you would regard the person if they hadn't cloned themselves.
Otherwise, if only one of them is held culpable, you get the Star Trek teleporter problem: there is no principled way to pick which instance is the "real" perpetrator.
And if you divide culpability, someone can reduce their own punishment 50-fold by creating 50 clones; and if they aren't sympathetic to the clones' suffering, they might do just that.
The sad thing about this policy is having to multiply the amount of suffering inflicted by the punishment. You would much rather someone not create a clone after committing a crime, because then you'll have to punish multiple people. Maybe in such a future, cloning would itself be a crime for someone who had previously committed a crime.
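The arithmetic behind the two failure modes above can be made explicit. This is a toy sketch, not part of the original discussion; the function name and numbers are illustrative assumptions.

```python
def per_instance_punishment(base: float, n_clones: int, policy: str) -> float:
    """Punishment each instance receives if the offender makes n_clones copies.

    base: the punishment the un-cloned offender would have received.
    policy: "divide" splits culpability across instances; "full" holds
    every instance as culpable as the un-cloned person would have been.
    """
    n_instances = 1 + n_clones  # the original plus the clones
    if policy == "divide":
        # Divided culpability: each instance bears only a fraction,
        # so cloning dilutes the offender's own punishment.
        return base / n_instances
    elif policy == "full":
        # Full culpability: each instance's punishment is unchanged,
        # but total suffering inflicted scales with n_instances.
        return base
    raise ValueError(f"unknown policy: {policy!r}")

# Dividing culpability lets an offender dilute their punishment roughly
# 50-fold by making 50 clones (51 instances in all):
diluted = per_instance_punishment(100.0, 50, "divide")   # 100 / 51 per instance
# Full culpability avoids the dilution, at the cost of multiplied total suffering:
total = 51 * per_instance_punishment(100.0, 50, "full")  # 51 * 100 in total
```

The sketch shows the dilemma: "divide" creates an evasion incentive, while "full" forces the punisher to multiply the suffering inflicted.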
There’s a missing step in this result. Moral culpability is about judgement and condemnation of actions (and the actors who performed them), not (necessarily) about punishment. Calculation of optimal punishment is about influencing FUTURE actions, not about judging past actions. It’s not fully disconnected from past culpability, but it’s not at all the same thing.
You may have to increase total suffering, but you may not—perhaps punishing one clone randomly is sufficient to achieve the punishment goals (deterring future bad actions by that decision-maker and by observers). Even if there’s more summed punishment needed to have the same level of deterrence, presumably the clones increased total joy as well, and the net moments of lives-worth-living is somewhat increased.
Now if the cloning ITSELF is a moral wrong (say, it uses resources in a way that causes unjustified harm to others), you pretty much have to overreact—make it far more painful to all the clones, and more painful for more clones. But I’d argue that the culpability for the punishment pain falls on the clones as well, rather than the judge or hangman.
How can you be culpable of an act that you didn’t commit? The law doesn’t punish people just for being the kind of person who would commit an evil act.
Well, for one, they wouldn’t just be the kind of person who would commit an evil act, they’re the kind of person who did commit an evil act.
But ok, how do you suggest solving the two evasion tactics described?
No, they are a clone of a person who did commit an evil act. You can’t claim that they are the very same person in the sense of numerical identity. (Numerical identity is the kind of identity that does not hold between identical twins. Identical twins are not the same person; they are two identical people.)
I’m not talking about identical twins, I’m talking exactly about numerical identity: perfect cloning.
Cloning isn’t numerical identity. A clone is an artificial twin, and twins are not numerically identical. Numerical identity relates to aliasing, to having two labels for the same entity. Stefani Germanotta and Lady Gaga are numerically identical.
Ok, so I confused the term with something else, oops. My point is that I’m talking about an exact copy, in the sense discussed in the Quantum Mechanics and Personal Identity sequence.
The exactness of the copy doesn’t matter. If a twin commits a crime, the other twin is not held responsible, because they did not commit the crime, not because there is some minute difference between them.
Ok, how do you respond to this formulation of the problem?