If you want to know why some particular person called your position ridiculous,
Nobody did; I was replying to the insinuation that it must be ridiculous, regardless of the reasoning.
My own argument/illustration is that for something to be called the ethically right choice, things should work out okay if more people chose it, the more the better. …
That doesn’t work if this is a one-off event; and equating “distributed” with “concentrated” torture requires resolving the multiperson utility-aggregation problem, so it would be hard to consider either route ridiculous (as implied by the comment where I entered the thread).
The event doesn’t need to be repeated; the type of event needs to be repeated (whether you’ll choose a minor disutility spread across many, or a large disutility concentrated on one). And these types of choices do happen repeatedly, all the time, even though most of them aren’t about absurdly large numbers like 3^^^3 or absurdly small disutilities like a dust speck. Things that our minds aren’t made to handle.
If someone asked you whether it’d be preferable to save a single person from a year’s torture, but in return a billion people would have to get their legs broken, I bet you’d choose to leave the person tortured, because the numbers are a bit more reasonable, and so your brain’s intuition returns the actual proper choice...
The event doesn’t need to be repeated; the type of event needs to be repeated (whether you’ll choose a minor disutility spread across many, or a large disutility concentrated on one). And these types of choices do happen repeatedly, all the time, even though most of them aren’t about absurdly large numbers like 3^^^3 or absurdly small disutilities like a dust speck.
But that assumes they are indeed the same type (that the difference in magnitude does not become a difference in kind); and if they are not, it would matter whether this choice would in fact generalize.
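To make the “repeated type of choice” framing concrete, here is a toy sketch of my own (not from the thread) that tallies disutility under a naively additive aggregation rule. Whether disutilities can be summed across people this way is precisely the multiperson-aggregation problem disputed above, and every magnitude is an illustrative placeholder: 3^^^3 is far too large to represent, so a stand-in population is used.

```python
# Toy tally of repeated "specks vs. torture" type choices, under a
# naively additive aggregation rule. All magnitudes are placeholders;
# a stand-in population substitutes for the unrepresentable 3^^^3.

POPULATION = 10**9   # stand-in for 3^^^3 people
SPECK = 1e-9         # disutility of one dust speck (arbitrary units)
TORTURE = 10**6      # disutility of 50 years of torture (same units)

def total_disutility(choice: str, repetitions: int) -> float:
    """Aggregate disutility after the same type of choice is made repeatedly."""
    if choice == "specks":
        # each repetition gives every member of the population one more speck
        return repetitions * POPULATION * SPECK
    # each repetition tortures one additional person
    return repetitions * TORTURE

for k in (1, 10, 1000):
    print(k, total_disutility("specks", k), total_disutility("torture", k))
```

Under these placeholder constants the specks come out far cheaper, while anything remotely 3^^^3-sized for POPULATION flips the verdict; the additive rule does not settle the dispute so much as relocate it into the choice of constants.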
If someone asked you whether it’d be preferable to save a single person from a year’s torture, but in return a billion people would have to get their legs broken, I bet you’d choose to leave the person tortured…
No, I wouldn’t, and for the same reason I wouldn’t in the dust specks case: the 3^^^3 people can collectively buy off the torturee (i.e. provide enough compensation to make the torture preferable, given the payment) if that setup is Pareto-suboptimal, while the reverse is not true.
[EDIT to clarify the above paragraph: if we go with the torture, and it turns out to be Pareto-suboptimal, there’s no way the torturee can buy off the 3^^^3 people; it’s a case where willingness to pay collides with the ability to pay (or perhaps, to accept). In other words, if the torturee were offered enough money to buy off the others (not part of the problem), he or she would use the money for such a payment.
In contrast, if we went with the dust specks, and it turned out to be Pareto-suboptimal, then the 3^^^3 could come up with a way (perhaps by lottery) to buy off the torturee and make a Pareto improvement. Since I would prefer that we be in situations we can Pareto-improve away from rather than ones we can’t, I prefer the dust specks.
Moreover, increasing the severity of the disutility that the 3^^^3 get (say, to broken legs, random murder, etc.) does not change this conclusion; it just increases the consumer surplus (or decreases the consumer “deficit”) from buying off the torturee. /end EDIT]
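A minimal back-of-the-envelope sketch of this asymmetry, using the billion-broken-legs variant (3^^^3 being unrepresentable) and hypothetical dollar figures of my own choosing:

```python
# Sketch of the buy-off asymmetry from the EDIT above, using the
# billion-broken-legs variant and hypothetical dollar figures.

N = 10**9                 # people on the distributed side
C_TORTUREE = 10**7        # compensation the one would accept for the torture ($)
C_PER_PERSON = 10**4      # compensation each of the many would accept ($)
TORTUREE_WEALTH = 10**5   # the one person's total ability to pay ($)

# Direction 1: the many buy off the one. The per-person share shrinks
# with N, so willingness to pay and ability to pay coincide.
print(f"each of the many pays ${C_TORTUREE / N:.4f}")

# Direction 2: the one buys off the many. The required total grows with
# N and dwarfs any individual's wealth: willingness != ability to pay.
required = N * C_PER_PERSON
print(f"the one must raise ${required:,} against ${TORTUREE_WEALTH:,}")
print("feasible:", TORTUREE_WEALTH >= required)   # False
```

The exact figures are irrelevant; what matters is the scaling: the per-person cost in one direction goes to zero as N grows, while the total required in the other direction grows without bound.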
Whatever error I’ve made here does not appear to stem from “poor handling of large numbers”, the ostensible point of the example.