Huh.
Can you clarify exactly why it matters?
That is… I recognize that on a superficial level it feels like it matters, so if you’re making a point about how to manipulate human psychology, then I understand that.
OTOH, if you’re making an ethical point about the value of life, I don’t quite understand why the value of those 400 lives is dependent on how many people there are in… well, in what? The world? The galaxy? The observable universe? The unobservable universe? Other?
I’m making a point about human psychology. The value of a life obviously does not change.
Although, I suppose that theoretically, if the concern is not over individual lives but over the survival of the species as a whole, and those 500 are the only people in existence, then picking the certain 400 option would make sense.
Well, if there are only 400 people in the universe, option 1 means you’re saving them all and nobody needs to die.
But that’s a rather silly interpretation. That option 2 exists obviously means there are at least 500 people in the universe.
I agree with all of this.
To clarify, that’s how many people in “The world? The galaxy? The observable universe? The unobservable universe? Other?” are going to die. You can save a maximum of 500 in this manner.
Um.
OK… I still seem to be missing the point.
So I have a choice between:
A. “Save 400 lives, allow (N-400) people to die, with certainty.”
B. “Save 500 lives (allow N-500 people to die), 90% probability; save no lives (allow N people to die), 10% probability.”
Are you suggesting that my choice between A and B ought to depend on N?
If so, why?
It doesn’t depend on N if N is consistent between options A and B, but it would if the two options had different Ns. That would make for an odd hypothetical scenario; I was just saying that the setup doesn’t make it completely explicit.
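To spell out the arithmetic (a sketch, assuming both options share the same N and that expected deaths are the measure):

E[deaths | A] = N - 400
E[deaths | B] = 0.9(N - 500) + 0.1N = N - 450

The gap of 50 expected lives in B’s favor is the same for every N, so the comparison between A and B never turns on the size of the background population.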