Suppose we vivisect an entire universe full of simulated people. If there are enough people, it might not matter; the utility might outweigh the costs.
That’s what your thread is right now. The reader is left baffled as to what utility you could possibly be referring to: are we referring to the utility some lunatic gets from knowing that there are people being vivisected? And are we disregarding the disutility of the people being vivisected? Why is their disutility lower because there are more people in the universe? Does the absolute importance of a single person decrease relative to the absolute number of people?
You don’t discuss the medical knowledge, or whatever other utility everybody else is getting from these vivisections.
Ah.
I thought I made that clear:
If we simulate an entire society a trillion times, or 3^^^^^^3 times, or some similarly vast number, and then simulate something horrific—an individual’s private harem or torture chamber or hunting ground—then the people in this simulation are not real. Their needs and desires are worth, not nothing, but far less than the merest whims of those who are Really Real. They are, in effect, zombies—not quite p-zombies, since they are conscious, but e-zombies—reasoning, intelligent beings that can talk and scream and beg for mercy but do not matter.
I think I may have laid too much emphasis on the infinitesimally small Reality of the victims, as opposed to the Reality of the citizens.
Retracted last comment because I realized I was misreading what you were saying.
Let me approach this from another direction:
You’re basically supposing that a 1/N chance of being tortured is morally equivalent to a 1/N chance of being tortured with an implicit guarantee that somebody is going to get tortured. I think it is consistent to regard a 1/N chance of me being tortured, for some sufficiently large N, as less important than 1 in N people actually being tortured.
If you create a precise duplicate of the universe in a simulation, I don’t regard that we have gained anything; I consider that two instances of indistinguishable utility aren’t cumulative. If you create a precise duplicate of me in a simulation and then torture that duplicate, utility decreases.
This may seem to be favoring “average” utility, but I think the distinction is in the fact that torturing an entity represents, not lower utility, but disutility; because I regard a duplicate universe as adding no utility, the negative utility shows up as a net loss.
I’d be hard-pressed to argue about the “indistinguishability” part, though I can sketch where the argument would lie: because utility exists as a product of the mind, and duplicate minds are identical from an internal perspective, an additional indistinguishable mind doesn’t add anything. Of course, this argument may require buying into the anthropic perspective.
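To make the aggregation rule I’m describing concrete, here is a toy sketch (the numbers are purely hypothetical, and treating “indistinguishable” as “equal utility value” is a simplifying assumption): duplicate experiences add no further utility, but a tortured duplicate becomes a distinguishable experience whose disutility does count.

```python
# Toy model of the aggregation rule above (hypothetical utility numbers).
# Assumption: indistinguishable experiences are modeled as equal values,
# so counting each distinct value once means duplicates add nothing.

def aggregate(experiences):
    """Sum utilities, counting each distinct experience only once."""
    return sum(set(experiences))

base_universe = [10]        # one universe worth +10
duplicated    = [10, 10]    # exact duplicate simulation: still +10
tortured_dup  = [10, -50]   # the duplicate is tortured: now distinguishable

assert aggregate(base_universe) == 10
assert aggregate(duplicated) == 10     # duplication gains nothing
assert aggregate(tortured_dup) == -40  # the disutility shows up as a net loss
```

Under total utilitarianism the duplicated universe would score +20; under this rule it scores +10, yet the torture still drags the total down, which is the asymmetry I mean.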
I’m basically assuming this reality-fluid stuff is legit for the purposes of this post. I included the most common argument in its favor (the probability argument), but I’m not setting out to defend it; I’m just exploring the consequences.
If you’re in a simulation right now, how would you feel about those running the machine simulating you? Do you grant them moral sanction to do whatever they like with you, because you’re less than them?
I mean, maybe you’re here as a representative of the people running the machine simulating me. I’m not sure I like where your train of thought is going, in that case.
I’m puzzled as to why they should matter less.
Because they are less.
Why?
If you’re in a simulation right now, how would you feel about those running the machine simulating you? Do you grant them moral sanction to do whatever they like with you, because you’re less than them?
I mean, maybe you’re here as a representative of the people running the machine simulating me. I’m not sure I like where your train of thought is going, in that case.
Honestly, I would have upvoted just for this bit.