Forgive me if this is beating a dead horse, or if someone brought up an equivalent problem before; I didn’t see such a thing.
I went through a lot of comments on dust specks vs. torture. (It seems to me like the two sides were miscommunicating in a very specific way, which I may attempt to make clear at some point.) But now I have an example that seems to be equivalent to DSvsT, is easily understandable via my moral intuition, and gives the “wrong” (i.e., not purely utilitarian) answer.
Suppose I have ten people and a stick. The appropriate infinitely powerful theoretical being offers me a choice. I can hit all ten of them with a stick, or I can hit one of them nine times. “Hitting with a stick” has some constant negative utility for all the people. What do I do?
This seems to me to be exactly dust specks vs. torture scaled down to humanly intuitable scales. I think the obvious answer is to hit all the people once. Examining my intuition tells me that this is because I think the aggregation function for utility is different across different people than across one person’s possible futures. Specifically, my intuition tells me to maximize across people the minimum expected utility across an individual’s future.
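To make that aggregation rule concrete, here is a minimal sketch, assuming (as stipulated above) that each hit costs a constant one unit of utility; the function names are just illustrative:

```python
# Utility profiles: one number per person; each hit costs a constant 1 unit,
# as stipulated in the problem.
hit_everyone_once = [-1] * 10        # all ten people hit once each
hit_one_nine_times = [-9] + [0] * 9  # one person hit nine times, the rest untouched

def total_utility(profile):
    """Sum utility across people: the straightforwardly utilitarian aggregation."""
    return sum(profile)

def maximin(profile):
    """Score an outcome by its worst-off person: the aggregation my intuition uses."""
    return min(profile)

print(total_utility(hit_everyone_once), total_utility(hit_one_nine_times))  # -10  -9
print(maximin(hit_everyone_once), maximin(hit_one_nine_times))              # -1   -9
# Total utility favors hitting one person nine times (-9 > -10);
# maximin favors hitting everyone once (-1 > -9).
```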
So, is there a name for this position?
Do people think my example is equivalent to DSvsT?
Do people get the same or different answer with this question as they do with DSvsT?
DSvsT was not directly an argument for utilitarianism, it was an argument for tradeoffs and quantitative thinking and against any kind of rigid rules, sacred values, or qualitative thinking which prevents tradeoffs. For any two things, both of which have some nonzero value, there should be some point where you are willing to trade off one for the other—even if one seems wildly less important than the other (like dust specks compared to torture). Utilitarianism provides a specific answer for where that point is, but the DSvsT post didn’t argue for the utilitarian answer, just that the point had to be at less than 3^^^3 dust specks. You would probably have to be convinced of utilitarianism as a theory before accepting its exact answer in this particular case.
The stick-hitting example doesn’t challenge the claim about tradeoffs, since most people are willing to trade off one person getting hit multiple times with many people each getting hit once, with their choice depending on the numbers. In a stadium full of 100,000 people, for instance, it seems better for one person to get hit twice than for everyone to get hit once. Your alternative rule (maximin) doesn’t allow some tradeoffs, so it leads to implausible conclusions in cases like this 100,000x1 vs. 1x2 example.
I don’t think maximising the minima is what you want. Suppose your choice is to hit one person 20 times, or five people 19 times each. Unless your intuition is different from mine, you’ll prefer the first option.
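Running the same two aggregation rules on these counterexamples (again a toy sketch, assuming a constant -1 per hit) shows where maximin diverges from most people’s intuitions:

```python
# Toy assumption as in the original problem: each hit is a constant -1 of utility.
def total_utility(profile):
    return sum(profile)

def maximin(profile):
    return min(profile)

# 100,000 people hit once each vs. one person in the stadium hit twice
stadium_all_once = [-1] * 100_000
stadium_one_twice = [-2] + [0] * 99_999

# one person hit 20 times vs. five people hit 19 times each
one_hit_twenty = [-20] + [0] * 4
five_hit_nineteen = [-19] * 5

for a, b, label in [(stadium_all_once, stadium_one_twice, "stadium"),
                    (one_hit_twenty, five_hit_nineteen, "20 vs 5x19")]:
    better_total = "first" if total_utility(a) > total_utility(b) else "second"
    better_maximin = "first" if maximin(a) > maximin(b) else "second"
    print(f"{label}: total utility prefers the {better_total} option, "
          f"maximin prefers the {better_maximin} option")

# stadium:    total utility prefers the second option (one person hit twice);
#             maximin prefers the first (everyone hit once)
# 20 vs 5x19: total utility prefers the first option (one person hit 20 times);
#             maximin prefers the second (five people hit 19 times each)
```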
“Hitting with a stick” has some constant negative utility for all the people.
I don’t think you can justifiably expect to be able to tell your brain something this self-evidently unrealistic, and have it update its intuitions accordingly.
I went through a lot of comments on dust specks vs. torture. (It seems to me like the two sides were miscommunicating in a very specific way, which I may attempt to make clear at some point.)
Oh, and I’d love to hear what you mean about this.
There’s one difference, which is that the inequality of the distribution is much more apparent in your example, because one of the options distributes the pain perfectly evenly. If you value equality of distribution as worth more than one unit of pain, it makes sense to choose the equal distribution of pain. This is similar to economic discussions about policies that lead to greater wealth, but greater economic inequality.
I think the point of Dust Specks Vs Torture was scope failure. Even allowing for some sort of diminishing marginal disutility, once you hit a number as wacky as 3^^^3 it doesn’t matter. A disutility of .000001 multiplied by 3^^^3 is worse than anything, because 3^^^3 is wacky huge.
For the stick example, I’d say it would have to depend on a lot of factors about human psychology and such, but I think I’d hit the one. Marginal utility tends to go down with each additional unit, and I think that the shock of repeated blows would be less than the shock of one blow against each of ten separate people.
I think your opinion basically is an appeal to egalitarianism, since you expect negative utility to yourself from an unfair world where one person gets something that ten other people did not, for no good or fair reason.
I think you’re mistaken about the marginal utility—being hit again after you’ve already been injured (especially if you’re hit on the same spot) is probably going to be worse than the first blow.
Marginal disutility could plausibly work in the opposite direction from marginal utility.
Each 10% of your money that you lose impacts your quality of life more. Each 10% of money that you gain impacts your quality of life less. There might be threshold effects for both, but I think the direction is right.
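One toy way to see that direction, assuming a logarithmic utility of wealth and fixed-size steps of 10% of the starting amount (both assumptions are mine, not anything established in the thread):

```python
import math

# Toy model: quality of life ~ log(wealth). Start with 100 units and move in
# steps of 10 (10% of the starting amount), so every step is the same size in money.
wealth, step = 100.0, 10.0

losses = [math.log(wealth - (i + 1) * step) - math.log(wealth - i * step) for i in range(5)]
gains = [math.log(wealth + (i + 1) * step) - math.log(wealth + i * step) for i in range(5)]

print([round(x, 3) for x in losses])  # -0.105, -0.118, -0.134, -0.154, -0.182: each loss hurts more
print([round(x, 3) for x in gains])   # +0.095, +0.087, +0.080, +0.074, +0.069: each gain helps less
```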
I was thinking more along the lines of scope failure: If someone said you were going to be hit 11 times, would you really expect it to feel exactly 110% as bad as being hit ten times?
But yes, from a traditional economics point of view, your post makes a hell of a lot more sense. Upvoted.
Marginal utility tends to go down with each additional unit, and I think that the shock of repeated blows would be less than the shock of one blow against each of ten separate people.
Part of the assumption of the problem was that hitting with a stick has some constant negative utility for all the people.
Part of the assumption of the problem was that hitting with a stick has some constant negative utility for all the people.
It’s always hard to think about this sort of thing. I read that in the original problem, but then I ended up thinking about actual hitting people with sticks when deciding what was best. Is there anything in the archives like The True Prisoner’s Dilemma but for giving an intuitive version of problems with adding utility?
Then it depends. If you’re a utilitarian, it is still better to hit the guy nine times than to hit ten people once each.
If you allow some ideas about the utility of equality, then things get more complicated. That’s why I think most people reject the simple math that 9 < 10.
I’d analyze your question this way. Ask any one of the ten people which they would prefer:
A) to get hit
B) to have a 1/10th chance of getting hit 9 times.
Assuming rationality and constant disutility of getting hit, every one of them would choose B.
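The arithmetic behind that, assuming a constant disutility of -1 per hit (the value itself is arbitrary; only the ratio matters):

```python
u_hit = -1.0                      # constant disutility per hit, as stipulated in the problem

expected_a = 1.0 * u_hit          # A: get hit once for certain          -> -1.0
expected_b = 0.1 * (9 * u_hit)    # B: 1/10 chance of being hit 9 times  -> -0.9

print(expected_a, expected_b, expected_b > expected_a)  # -1.0 -0.9 True: B is preferred
```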