To prefer a 60% chance of the destruction of more than two existences to the certainty of the extinction of humanity in one of them is an interesting position.
Clearly, however, such a preference either incurs local privilege, or it should be just as logical to prefer the 60% destruction of more than everything over the certain destruction of a different simulation, one that would never interact with the one the agent experiences.
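The trade-off above can be made concrete with a toy expected-value calculation. This is only a sketch: the utilities are hypothetical, and the key assumption is that every existence is weighted equally (i.e., no local privilege):

```python
# Toy expected-loss comparison, with hypothetical utilities.
# Option A: 60% chance that more than two existences are destroyed
#           (modeled here as exactly two, the minimum case).
# Option B: certainty that one existence (ours) is destroyed.

def expected_loss(p_destroy, worlds_lost, value_per_world=1.0):
    """Expected loss when each world is valued equally (the no-privilege assumption)."""
    return p_destroy * worlds_lost * value_per_world

loss_a = expected_loss(0.6, 2)  # 60% chance of losing two worlds
loss_b = expected_loss(1.0, 1)  # certain loss of one world

# Under equal weighting, Option B has the smaller expected loss, so
# preferring Option A requires valuing one's own world above the others.
assert loss_a > loss_b
```

The point of the sketch is that the stated preference only comes out ahead if `value_per_world` differs between the agent's own existence and the others, which is exactly the "local privilege" in question.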
To prefer a 60% chance of the destruction of more than two existences to the certainty of the extinction of humanity in one of them is an interesting position.
Yes, far from inconceivable, and perhaps even held coherently by a majority of humans, but certainly different from mine. I have decidedly different preferences; in certain cases it's less than that. If I found I was in certain kinds of simulations, I'd value my own existence either less or not at all.
Clearly, however, such a preference either incurs local privilege
Yes, it would (assuming I understand correctly what you mean by that).
I hadn’t considered the angle that the simulation might be run by an actively hostile entity; in that case, destroying the hostile entity (ending the simulation) is the practical thing to do at the top layer, and also the desired result in the simulation (end of simulation rather than torture).
Undefined. Legitimate and plausible consequentialist value systems can be conceived that go either way.