I suspect from some of the comments I’m getting that I didn’t make this clear: copies are identical in this scenario. They receive the same inputs, make the same choices, and they think and feel as one. They are, in short, one person (or civilization). But one with vastly more reality-fluid (sometimes known as “measure”) and thus, as far as I can tell, moral weight.
Is it worse to torture a virtual person running on redundant hardware (say, 3 computers in lock-step, like the Space Shuttle used) whose permanent state (or backups) is stored on a RAID1 array of disks than to torture a virtual person running on a single CPU with one disk? Or, even simpler: is it worse to torture a more massive person than a less massive person? Personally, I would say no.
Just like there’s only one electron, I think there’s only one of any particular thing, at least in the map. The territory may actually be weird and strange, but I don’t have any evidence that redundant exact copies have as much moral weight as a single entity. I think that it’s worse to torture 1 non-redundant person than it is to torture n-1 out of n exact copies of that person, for any n. That only applies if it’s exactly the same simulation n-1 times. If those simulations start to diverge into n different persons, it starts to become as bad as torturing n different unique people. Eventually even those n-1 exact copies would diverge enough from the original to be considered copies of a different person with its own moral weight. My reasoning is just probabilistic expected utility: it’s worse for an agent to expect p(torture) = 1 than p(torture) = (n-1)/n, and an identical agent can’t distinguish between identical copies of itself (including its environment).
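To spell out that expected-utility comparison (a minimal worked restatement of the claim above; D is a stand-in symbol, not from the comment, for the disutility of being tortured):

$$
\mathbb{E}[\text{disutility} \mid \text{1 non-redundant person tortured}] = 1 \cdot D,
\qquad
\mathbb{E}[\text{disutility} \mid n-1 \text{ of } n \text{ identical copies tortured}] = \frac{n-1}{n}\,D,
$$

and (n-1)/n · D < D for every finite n, with the gap D/n shrinking as n grows.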
As soon as you start torturing one of those identical agents, it ceases to be identical.
I guess the question from there is, does this produce a cascade of utility, as small divergences in the simulated universe produce slightly different agents for the other 6 billion people in the simulation, whose utility then exists independently?
That people gain moral worth the more “real” they get is a position I have seen on LW, and, unintuitive as it is, the arguments for it do seem reasonable. (It is also rather more coherent when used in a Big Universe.) This post assumes that position, and includes a short version of the most common argument for it.
Incidentally, I used to hold the position you describe; how do you deal with the fact that a tortured copy is, by definition, no longer “part” of the original?
But one with vastly more reality-fluid (sometimes known as “measure”) and thus, as far as I can tell, moral weight.
This is very thought-provoking. Can you clarify your views on this point?
For instance, should I imply a “vastly” in front of “moral weight” as well, as if there is a 1:1 correspondence, or should I not do that?
Is this the only moral consideration you are considering on this tier? (I.e., there may be other moral considerations, but if this is the only “vast” one, it will probably outweigh all others.)
Does the arrangement of the copies’ reality-fluid matter? Omega is usually thought of as a computer, so I am considering the file system. He might have 3 copies in 1 file for resilience, as in a RAID array. Or he might have 3 copies in 3 separate files, say Sim001.exe, Sim002.exe, and Sim003.exe, all with exactly the same contents and sitting in the same folder. In both cases the copies are identical, and if they are being run and updated simultaneously, they might not be able to tell which structure Omega was using. Which of these are you envisioning (or would it not matter? [Or do I not understand what a RAID array is?])
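To make the two arrangements concrete, here is a minimal sketch (my own illustration, not from the thread, assuming a hypothetical deterministic update function `step` as a stand-in for the simulation). It only shows that lock-step identical copies end up in identical states under either layout, so nothing inside the simulation could tell the difference:

```python
import hashlib

def step(state: bytes) -> bytes:
    # Hypothetical deterministic update rule standing in for one tick of the simulation.
    return hashlib.sha256(state).digest()

initial = b"identical initial state"

# Arrangement A: one logical copy mirrored onto 3 disks (RAID1-style):
# three physical replicas of the same bytes.
mirrored = [initial] * 3

# Arrangement B: three separate files (Sim001 / Sim002 / Sim003) with identical contents.
separate = {"Sim001": initial, "Sim002": initial, "Sim003": initial}

# Run both arrangements in lock-step for a few ticks.
for _ in range(5):
    mirrored = [step(s) for s in mirrored]
    separate = {name: step(s) for name, s in separate.items()}

# Every replica, under either arrangement, ends in exactly the same state.
assert len({*mirrored, *separate.values()}) == 1
```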
Some of these questions may be irrelevant, and if so, I apologize; I really am not sure I understand enough about your point to reply to it appropriately. Again, it does sound thought-provoking.
For instance, should I imply a “vastly” in front of “moral weight” as well, as if there is a 1:1 correspondence, or should I not do that?
Pretty much, yeah.
Is this the only moral consideration you are considering on this tier? (I.e., there may be other moral considerations, but if this is the only “vast” one, it will probably outweigh all others.)
Well, I’m considering the torturee’s disutility and the torturers’ utility.
Does the arrangement of the copies’ reality-fluid matter? Omega is usually thought of as a computer, so I am considering the file system. He might have 3 copies in 1 file for resilience, as in a RAID array. Or he might have 3 copies in 3 separate files, say Sim001.exe, Sim002.exe, and Sim003.exe, all with exactly the same contents and sitting in the same folder. In both cases the copies are identical, and if they are being run and updated simultaneously, they might not be able to tell which structure Omega was using. Which of these are you envisioning (or would it not matter? [Or do I not understand what a RAID array is?])
I’m not entirely sure I understand this question, but I don’t think it should matter.