Wait, what was the conclusion of dust specks? I’m guessing “torture”, but then, why is this conclusion so strong and obvious (after the fact)? I had always been on the dust-specks side, for a few reasons, but I’d like to know why this position is so ridiculous, and I still don’t know, despite having participated in those threads.
The problem attempts to define the situation so that “torture” is utility maximizing. Therefore if you are a utility maximizer, “torture” is the implied choice. The problem is meant to illustrate that in extreme cases utility maximization can (rightly or wrongly) lead to decisions that are counter-intuitive to our limited human imaginations.
For me, the sum of all the pains isn’t a good measure of the dreadfulness of a situation. The maximal pain is a better one.
But I don’t think it is more than a preference. It is my preference only, like preferring a strawberry to a blueberry.
For my taste, dust specks for everybody are better than a horrible torture for just one.
Ask yourself: in which world would you want to be in all the roles?
It’s worse to break the two legs of a single man than to break one leg each of seven billion people?
If a genie forced you to choose between the two options, would you really prefer the latter scenario?
I’m sorry, but I really can’t imagine the size of 3^^^3. So I really can’t answer this question by trying to imagine myself filling all those roles. My imagination just fails at that point. And if anyone here thinks they can imagine it, I think they’re deluding themselves.
But if anyone wants to try, I’d like to remind them that in a random sample there’d probably be innumerable quintillions of people that would already be getting tortured for life one way or another. You’re not removing all that torture if you vote against torturing a single person more.
First, I would eliminate the two-leg breaking; second, the one-leg breaking.
Of course, an epidemic of one-leg breaking would have other severe effects, like starvation and the like, which should come even before two broken legs.
In a clean, abstract world of just a broken leg or two per person, with no further implications, the maximal pain is still the first to be eliminated, if you ask me.
From behind the veil of ignorance, would you rather have a 100% chance of one broken leg, or a 1⁄7,000,000,000 chance of two broken legs and 6,999,999,999⁄7,000,000,000 chance of being unharmed?
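For concreteness, here is the expected-harm arithmetic that question is probing (a minimal sketch; counting each broken leg as one unit of disutility is an assumption, not part of the problem):

```python
# Expected broken legs per person, from behind the veil of ignorance.
# Assumption: each broken leg counts as one unit of disutility.
N = 7_000_000_000

certain_one_leg = 1.0           # 100% chance of one broken leg
lottery_two_legs = (1 / N) * 2  # 1/N chance of two broken legs, else unharmed

print(certain_one_leg)   # 1.0
print(lottery_two_legs)  # ~2.9e-10
```

On that linear accounting, the lottery is better by roughly nine orders of magnitude per person, which is what the veil-of-ignorance framing is designed to show.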
In your scenario I would opt for the small chance of two broken legs, of course.
But I would choose the one broken leg if that meant the total number of two-broken-leg cases would go to zero.
In other words: I would vaccinate everybody (the vaccination causes discomfort) to eliminate a deadly disease like Ebola, which kills few.
What would you do?
Creatures somewhere in existence are going to face death and severe harm for the foreseeable future. This view then seems inert.
There are enough minor threats with expensive countermeasures (more expensive the higher the reliability demanded) that this approach would devour all available wealth. It would bar us from, e.g., traveling for entertainment (a risk of death exists whether we walk, drive, or fly). I wouldn’t want that tradeoff for society or for myself.
I would endorse choosing a broken leg for one person if that guaranteed that nobody in the world had two broken legs, certainly. This seems to have drifted rather far from the original problem statement.
I would also vaccinate a few billion people to avoid a few hundred deaths/year, if the vaccination caused no negative consequences beyond mild discomfort (e.g., no chance of a fatal allergic reaction to the vaccine, no chance of someone starving to death for lack of the resources that went towards vaccination, etc).
I’m not sure I would vaccinate a few billion people to avoid a dozen deaths though… maybe, maybe not. I suspect it depends on how much I value the people involved.
I probably wouldn’t vaccinate a few billion people to avoid a .000001 chance of someone dying. Though if I assume that people normally live a few million years instead of a few dozen, I might change my mind. I’m not sure though… it’s hard to estimate with real numbers in such an implausible scenario; my intuitions about real scenarios (with opportunity costs, knock-on effects, etc.) keep interfering.
Which doesn’t change my belief that scale matters. Breaking one person’s leg is preferable to breaking two people’s legs. Breaking both of one person’s legs is preferable to breaking one of a million people’s legs.
I don’t think you understand the logic behind the anti-speckers’ choice. It isn’t that we always oppose the greater number of minor disutilities. It’s that we believe there’s an actual judgment to be made given the specific disutilities and numbers involved—you, on the other hand, ignore the numbers altogether.
I would vaccinate everyone to eradicate Ebola, which kills few. But I would not vaccinate everyone to eradicate a different disease that mildly discomforts a few people only slightly more than the vaccination process itself does.
The logic is: integrate the two evils over time and eliminate the one with the bigger integral!
I just don’t agree with it.
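To make the disagreement concrete, here is a minimal sketch of the two aggregation rules being contrasted: summed (“integrated”) disutility versus maximal pain. All the numbers are illustrative assumptions, with 1e18 standing in for 3^^^3:

```python
# Two ways to rank worlds by pain; they disagree once the population is large enough.
# All numbers are illustrative assumptions; 1e18 stands in for 3^^^3.
speck_pain, torture_pain = 1e-9, 1e6   # assumed per-person disutilities
population = 1e18                      # number of people each getting a speck

# Sum-of-pains ("integral") rule: the specks come out worse.
print(speck_pain * population > torture_pain)   # True (1e9 > 1e6)

# Maximal-pain rule: the torture comes out worse.
print(speck_pain < torture_pain)                # True (1e-9 < 1e6)
```

With a large enough population the two rules give opposite verdicts, so the choice of rule, not the arithmetic, is what is actually in dispute here.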
May I ask if you consider yourself a deontologist, a consequentialist, or something else?
Agreed that introducing knock-on effects (starvation and so forth) is significantly changing the scenario. I endorse ignoring that.
Given seven billion one-legged people and one zero-legged person, and the ability to wave a magic wand and cure either the zero-legged person or the 6,999,999,999 one-legged people, I heal the one-legged people.
That’s true even if I have the two broken legs.
That’s true even if I will get to heal the other set later (as is implied by your use of the word “first”).
If I’ve understood you correctly, you commit to using the wand to heal my legs instead of healing everyone else.
If that’s true, I will do my best to keep that wand out of your hands.
So you would do everything you can to prevent a low-probability but very bad scenario? Wouldn’t you just neglect it?
I would devote an amount of energy to avoiding that scenario that seemed commensurate with its expected value. Indeed, I’m doing so right now (EDIT: actually, on consideration, I’m devoting far more energy to it than it merits). If my estimate of the likelihood of you obtaining such a wand (and, presumably, finding the one person in the world who is suffering incrementally more than anyone else and alleviating his or her suffering with it) increases, the amount of energy I devote to avoiding it might also increase.
Different people had different answers. Eliezer was in favor of torture. I am likewise. Others were in favor of the dust specks.
If you want to know why some particular person called your position ridiculous, perhaps you should ask whatever particular person so called it.
My own argument/illustration is that for something to be called the ethically right choice, things should work out okay if more people chose it; the more who choose it, the better things should get. But in this case, if a billion people chose dust-specks or the equivalent thereof, then whole vast universes would be effectively tortured. A billion tortures would be tragic, but that pales in comparison to a whole universe getting tortured.
Therefore dust-specks is not a universalizable choice, therefore it’s not the ethically right choice.
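A rough way to write out that universalization step (a sketch; the per-speck disutility ε, the torture disutility T, and the billion choosers are assumed symbols, not from the original problem):

```latex
\text{universalized torture: } 10^{9}\,T
\qquad \text{vs.} \qquad
\text{universalized specks: } 3\uparrow\uparrow\uparrow 3 \text{ people} \times 10^{9}\,\epsilon \text{ each}
```

The crux is that 10^9 specks landing on each of the 3^^^3 people is no longer a minor per-person disutility, which is the sense in which whole universes end up effectively tortured.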
Nobody did; I was replying to the insinuation that it must be ridiculous, regardless of the reasoning.
That doesn’t work if this is a one-off event. And equating “distributed” with “concentrated” torture requires resolving the multiperson utility-aggregation problem, so it would be hard to consider either route ridiculous (as implied by the comment where I entered the thread).
The event doesn’t need to be repeated; the type of event needs to be repeated (whether you’ll choose a minor disutility spread across many, or a large disutility for one). And these types of choices do happen repeatedly, all the time, even though most of them aren’t about absurdly large numbers like 3^^^3 or absurdly small disutilities like a dust speck: things our minds aren’t made to handle.
If someone asked you whether it’d be preferable to save a single person from a year’s torture at the cost of a billion people getting their legs broken, I bet you’d choose to leave the person tortured; because the numbers are a bit more reasonable, your brain’s intuition returns the actually proper choice...
But that’s assuming they are indeed the same type (that the difference in magnitude does not become a difference in type); and if not, it would make a difference whether or not this choice would in fact generalize.
No, I wouldn’t, and for the same reason I wouldn’t in the dust-specks case: the 3^^^3 can collectively buy off the torturee (i.e., provide enough compensation to make the torture, given the compensation, preferable) if that setup is Pareto-suboptimal, while the reverse is not true.
[EDIT to clarify the above paragraph: if we go with the torture, and it turns out to be Pareto-suboptimal, there’s no way the torturee can buy off the 3^^^3 people—it’s a case where willingness to pay collides with the ability to pay (or perhaps, to accept). In other words, if the torturee were offered enough money to buy off the others (not part of the problem), he or she would use the money for such a payment.
In contrast, if we went with the dust specks, and that turned out to be Pareto-suboptimal, then the 3^^^3 could—perhaps by lottery—come up with a way to buy off the torturee and make a Pareto improvement. Since I would prefer we be in situations that we can Pareto-improve away from rather than ones we can’t, I prefer the dust specks.
Moreover, increasing the severity of the disutility that the 3^^^3 get—say, to broken legs, random murder, etc.—does not change this conclusion; it just increases the consumer surplus (or decreases the consumer “deficit”) from buying off the torturee. /end EDIT]
Whatever error I’ve made here does not appear to stem from “poor handling of large numbers”, the ostensible point of the example.
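A toy version of the buy-off asymmetry described in the EDIT above (a sketch; the dollar figures and the modest stand-in population are assumptions):

```python
# Why the specks world admits a Pareto improvement while the torture world doesn't:
# many people can jointly compensate one person, but one person cannot
# compensate unboundedly many. Figures are illustrative assumptions.
population = 10**18   # stand-in for 3^^^3
buyoff = 10**9        # dollars that would make the torturee accept the torture

print(buyoff / population)   # 1e-09 dollars per speck-receiver: trivially affordable

# Reverse direction: the torturee paying everyone even one cent is hopeless.
print(0.01 * population)     # 1e+16 dollars: willingness to pay collides with ability to pay
```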