I’m predicting that your thinking trended toward a particular uncomfortable answer, for at least one fraction of a second before you started finding reasons to question the dilemma itself.
It certainly happened to me. And it took considerable inner (and outer) digging to end up confirming the no-torture choice rather than switching to the “logical” one. In the end, I could not adequately resolve for myself the question: “How confident am I that EY’s logic of torture is airtight? Confident enough to inflict torture?” So I flinched again and decided that, for this particular problem, I am not willing to stake my decision on something to which I could not assign a satisfactorily high probability.
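The reasoning above can be put in expected-utility terms: act on an argument only if your confidence that it is sound, weighed against the cost of being wrong, still comes out positive. A minimal sketch, with all numbers purely illustrative assumptions (nothing here comes from the original comments):

```python
# Toy expected-utility check for acting on an argument you are not sure is sound.
# The function name and all magnitudes below are hypothetical illustrations.

def act_on_argument(p_sound, gain_if_sound, loss_if_unsound):
    """Return True if acting on the argument has positive expected utility."""
    return p_sound * gain_if_sound - (1 - p_sound) * loss_if_unsound > 0

# If the pro-torture argument is sound, acting on it averts a vast aggregate
# disutility; if it is unsound, you have inflicted decades of torture for nothing.
# With only moderate confidence, the downside dominates:
print(act_on_argument(p_sound=0.6, gain_if_sound=1e6, loss_if_unsound=1e9))   # False
# Only near-certainty flips the decision:
print(act_on_argument(p_sound=0.9999, gain_if_sound=1e6, loss_if_unsound=1e9))  # True
```

This is just the commenter's flinch made explicit: when the loss from a mistaken argument is enormous, the confidence threshold for acting on it climbs very high.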
Interestingly, I found the other classic dilemma, personally killing one person to save several, perfectly clear-cut, since there are no uncertainties there, only uncomfortable but necessary actions.
I wonder if this is a standard way of thinking.
Assuming this is a reference to the trolley problem—is the least convenient possible world the one where you yourself are large enough to stop the trolley, or the one where you are not? Because there definitely is uncertainty—you could sacrifice yourself instead.
Also, if someone else tried to toss you in front of a trolley to save 10 other people, would you resist?
I don’t know about shminux, but I sure as heck would. This is definitely a case of -wanting/-liking/+approving.
I would resist, too. I assign a very large positive utility to myself; I don’t know about you guys. My immediate family would be the only ones in the ballpark.