I am obliged to act based on my best information about the situation. If that best information tells me that:
I have no special positive obligations to anyone involved,
The one person is not willing to be run over to save the others (or is willing only for unrelated reasons, e.g. because ey is suicidal), and
The one person is not morally responsible for the situation at hand or for any other wrong act such that they have waived their right to life,
Then I am obliged to let the trolley go. However, I have a low prior on any given human being so uninterested in helping others (or at least in preserving an infrastructure to live in) that they wouldn’t be willing to die to save the entire rest of the human species. So if that were really what was at stake, the lone person tied to the track would have to be loudly announcing, “I am a selfish bastard and I’d rather be the last human alive than die to save everyone else in the world!”
And, again, prudential concerns would probably kick in, most likely well before there were hundreds of people on the line.
Would it be correct to say that, insofar as you would hope the one person would be willing to sacrifice his/her life for the cause of saving the 5*10^6 others, you yourself would pull the switch and then willingly submit to the death penalty (or whatever penalty there is for murder) for the same cause?
I’d be willing to die (including as part of a legal sentence) to save that many people. (Not that I wouldn’t avoid dying if I could, but if that were a necessary part of the saving-people process I’d still enact said process.) I wouldn’t kill someone I believed unwilling, even for the same purpose, including via trolley.