I am probably in way over my head here, but…
The closest thing to teleportation I can imagine is uploading my mind and sending the information to my intended destination at lightspeed. I wouldn’t mind if, once the information was copied, the teleporter deleted the old copy. If instead of 1 copy the teleporter made 50 redundant copies just in case, and destroyed 49 once it was confirmed the teleportation was successful, would that be like killing me 49 times? Are 50 copies of the same mind being tortured any different from 1 mind being tortured? I do not think so. It is just redundant information; there is no real difference in experience. Thus, in my mind, only 1 of the 50 minds matters (or the 50 minds are essentially 1 mind). The degree to which the other 49 matter is only equal to the difference in information they encode. (Of course, a superintelligence would see about as much relative difference in information between humans as we humans see in ants; but we must take an anthropocentric view of state complexity.)
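To make the “redundant information” intuition a bit more concrete, here is a minimal sketch, assuming we accept compressed size as a crude stand-in for the information a set of mind-states jointly encodes (the byte strings below are purely hypothetical placeholders for mind-states, not anything like a real measure):

```python
import zlib

def joint_info(states):
    # Crude proxy: the information jointly encoded by a set of states,
    # approximated by the compressed size of their concatenation.
    # Redundant copies mostly compress away.
    return len(zlib.compress(b"".join(states)))

# Hypothetical toy "mind-states" as byte strings.
original = b"memories, preferences, current thoughts " * 100
fifty_copies = [original] * 50
one_divergent = [original, original.replace(b"thoughts", b"torment")]

print(joint_info([original]))     # baseline
print(joint_info(fifty_copies))   # far less than 50x the baseline
print(joint_info(one_divergent))  # grows only by the difference encoded
```

On this toy measure, the 49 extra copies add almost nothing beyond the one, while a copy that has diverged adds roughly as much as the divergence itself.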
The me in other quantum branches can be very, very similar to the me in this one. I don’t mind dying in one quantum branch all that much if the me not-dying in other quantum branches is very similar to the me that is dying. The reason I would like there to be more mes in more quantum branches is that other people care about the mes. That is why I wouldn’t play quantum immortality games (along with the standard argument that in the vast majority of the worlds where you survive, you would end up horribly maimed).
If the additional identical copies count for something, despite my intuitions, at the very least I don’t think their value should aggregate linearly. I would hazard a guess that a utility function which does that has something wrong with it. If you had 9 identical copies of Bob and 1 copy of Alice, and you had to kill off 8 copies, there must be some terminal value for complexity that keeps you from selecting 8 at random and instead leads you to kill off 8 Bobs (given that Alice isn’t a serial killer, utility of Alice and Bob being equal, yada yada yada).
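Here is a minimal sketch of what a non-linearly-aggregating utility might look like; the 1 + log(copies) rule is just one arbitrary sublinear choice for illustration, not a claim about the right aggregation:

```python
import math
from collections import Counter
from itertools import combinations

def population_value(minds):
    # Toy sublinear aggregation: each distinct mind contributes
    # 1 + log(number of its copies), so an extra identical copy
    # adds far less than an additional distinct mind would.
    counts = Counter(minds)
    return sum(1 + math.log(n) for n in counts.values())

population = ["Bob"] * 9 + ["Alice"]

# Forced to kill off 8: which pair of survivors maximizes the value?
best = max(combinations(range(len(population)), 2),
           key=lambda keep: population_value([population[i] for i in keep]))
print(sorted(population[i] for i in best))  # ['Alice', 'Bob'] beats two Bobs
```

Under any such sublinear rule, keeping Alice plus one Bob falls out automatically, which is roughly the “terminal value for complexity” I have in mind.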
I think that maybe instead of minds it would be easier and less intuition-fooling to think about information. I also think that, like I said, I am probably missing the point of the post.
Do you consider a mind that has been tortured identical to one that has not? Won’t the torture process add non-trivial differences, to the point where the minds don’t count as identical?
It’s not a binary distinction. If an identical copy were made of one mind and tortured, while the other instance remained untortured, they would start to differentiate into distinct individuals. Since the rate of divergence would increase with the degree of difference in experience, I imagine torture vs. non-torture would spark a fairly rapid divergence.
I haven’t had the opportunity to commit to reading Bostrom’s paper, but in the little I did read, Bostrom thought it was “prima facie implausible and farfetched to maintain that the wrongness of torturing somebody would be somehow ameliorated or annulled if there happens to exist somewhere an exact copy of that person’s resulting brain-state.” That is, it seemed obvious to Bostrom that having two identical copies of a tortured individual must be worse than one instance of a tortured individual (actually twice as bad, if I interpret correctly). That does not at all seem obvious to me, as I would consider two (synchronized) copies to be one individual in two places. The only thing worse about having two copies that occurs to me is a greater risk of divergence, leading to increasingly distinct instances.
Are you asking whether it would be better to create a copy of a mind and torture it rather than not creating a copy and just getting on with the torture? Well, yes. It’s certainly worse than not torturing at all, but it’s not as bad as just torturing one mind. Initially, the individual would half-experience the torture. Fairly soon after, the single individual would separate into two minds, one being tortured and one not. This is arguably still better, from the perspective of the pre-torture mind, than the single-mind-single-torture scenario, since at least half of the mind’s downstream experience is not torture, versus 100% torture in the other case.
If this doesn’t sound convincing, consider a twist: would you choose to copy and rescue a mind-state from someone about to, say, be painfully sucked into a black hole, or would it be ethically meaningless to create a non-sucked-into-black-hole copy? Granted, it would be best to not have anyone sucked into a black hole, but suppose you had to choose?
Looks to me like Bostrom is trying to make the point that duplication of brain-states, by itself and devoid of other circumstances, is not sufficient to make the act of torture moral, or less harmful.
After reading through the paper, it looks to me like we’ve moved outside of what Bostrom was trying to address here. If synchronized brains lose individuality, and/or an integration process takes place that leads to a brain-state which has learned from the torture experience but remains unharmed, then the argument falls outside the scope of what Bostrom was addressing.
I agree with Bostrom on this point. It looks to me like, if Yorik is dismissing 49 tortured copies as inconsequential, he must also show that there is a process by which the knowledge accumulated by each of the 49 copies is synchronized and integrated into the remaining copy, without causing that copy (or anyone else, for that matter) any harm. Or there must be some other assumptions he is making about the copies that remove the harm done when they are destroyed; copying alone can’t remove responsibility for killing the copies.
For the black-hole example, copying the person about to be sucked into the hole is not ethically meaningless. The value of the copy, though, comes from its continued existence. The act of copying does not remove moral consequences from the sucking-into-the-black-hole act. If there is an agent X who pushed the individual into the black hole, that agent is just as responsible for his actions if he doesn’t copy the individual at the last minute as he would be if he does make a copy.
Can you please point me to Bostrom’s paper? I can’t seem to find the reference.
I’m very curious whether the quote is better fleshed out in its original context. As it stands here, it looks a lot like it’s affected by anthropomorphic bias (or maybe rests on a large number of hidden assumptions that I don’t share, around both the meaning of individuality and the odds that intelligences which regularly undergo synchronization would remain similar to ours).
I can imagine a whole space of real-life, many-integrated-synchronized-copies scenarios where the process of creating a copy and torturing it for kicks would be accepted, commonplace, and would not cause any sort of moral distress. To me, there is a point where torture and/or destruction of a synchronized, integrated, identical copy transitions into the same moral category as body piercings and tattoos.
Nick Bostrom, “Quantity of Experience: Brain-Duplication and Degrees of Consciousness”