Agreed that introducing knock-on effects (starvation and so forth) is significantly changing the scenario. I endorse ignoring that.
Given 6,999,999,999 one-legged people and one zero-legged person (seven billion people in all), and the ability to wave a magic wand and cure either the zero-legged person or the 6,999,999,999 one-legged people, I heal the one-legged people.
That’s true even if I have the two broken legs.
That’s true even if I will get to heal the other set later (as is implied by your use of the word “first”).
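The aggregation argument above can be sketched as simple arithmetic. The welfare weights here are assumptions chosen purely for illustration (the discussion never assigns numbers): suppose losing one leg costs 1 unit of welfare and losing both costs 3, i.e. more than twice as bad, since mobility is lost entirely. Even so heavily weighted, the single worst-off person is outweighed.

```python
# Hypothetical welfare costs (assumptions, not from the discussion):
ONE_LEG_COST = 1   # welfare lost per one-legged person
ZERO_LEG_COST = 3  # welfare lost by the zero-legged person

one_legged = 6_999_999_999
zero_legged = 1

# Total welfare restored by each possible use of the wand:
heal_many = one_legged * ONE_LEG_COST
heal_one = zero_legged * ZERO_LEG_COST

print(heal_many > heal_one)  # True: aggregation favors healing the many
```

The point survives any finite weighting of the worst-off person's suffering: multiply `ZERO_LEG_COST` by a million and the comparison still comes out the same way.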
If I’ve understood you correctly, you commit to using the wand to heal my legs instead of healing everyone else.
If that’s true, I will do my best to keep that wand out of your hands.
So you would do everything you can to prevent a low-probability but very bad scenario? Wouldn’t you just neglect it?
I would devote an amount of energy to avoiding that scenario that seemed commensurate with its expected value. Indeed, I’m doing so right now (EDIT: actually, on consideration, I’m devoting far more energy to it than it merits). If my estimate of the likelihood of you obtaining such a wand (and, presumably, finding the one person in the world who is suffering incrementally more than anyone else and alleviating his or her suffering with it) increases, the amount of energy I devote to avoiding it might also increase.
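The policy described, devoting energy commensurate with expected value, can be sketched as effort proportional to probability times magnitude of the bad outcome. The numbers below are hypothetical placeholders, not estimates anyone in the thread made:

```python
def effort_budget(probability: float, badness: float, scale: float = 1.0) -> float:
    """Effort (arbitrary units) proportional to expected disvalue.

    probability: estimated chance the bad scenario occurs
    badness: magnitude of harm if it does (hypothetical units)
    """
    return scale * probability * badness

# A wand falling into the wrong hands: tiny probability, huge harm.
low = effort_budget(probability=1e-9, badness=1e9)

# If the probability estimate rises, the warranted effort rises with it.
high = effort_budget(probability=1e-6, badness=1e9)

print(low, high)  # 1.0 1000.0
```

This captures why a small-probability scenario is neither ignored outright nor obsessed over: the budget scales smoothly with the likelihood estimate, exactly as the comment describes.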