One of your original criticisms of the choice of torture instead of specks was that that choice assumed very particular consequences of actions—that torturing wouldn’t ever affect future choices to torture. However, you assume that it would always affect future choices to torture by making it more likely. Both of these assumptions are too extreme for the real world, though fine for hypotheticals in which other questions—such as aggregation of utility—are the subject.
Arguing that something would usually or often happen doesn’t undermine the original thought experiment in which that wasn’t one of the variables. In practice, I’m happy to say that for some small amount of pain and some number of people, inflicting more pain per person on fewer people is preferable, but those numbers depend on other consequences of the choice. If, in practice, every choice to cause more pain to fewer people caused a plague somewhere whenever it was not the first week of December, GMT, that would affect the calculus. Sometimes it will be the first week of December, and in any case “some number of people” is not fixed and can differ from week to week.
If we state that torture and dust specks have exactly equal amounts of direct suffering, we should still obviously choose the specks.
If inflicting x pain on Q people for t1 time directly causes the same amount of suffering as inflicting y pain on R people for t2 time, and inflicting x pain on Q people for t1 time indirectly causes more suffering than inflicting y pain on R people for t2 time, we prefer the first option. That doesn’t undermine any utilitarianism or make one question the coherence of aggregating suffering.
At a certain point human beings will—from sheer necessity for psychological stability—engage in the suspension of moral belief. “One person dying is a tragedy; a thousand is a statistic; a million is a number.”
Teenage Mugger: [Dundee and Sue are approached by a black youth stepping out from the shadows, followed by some others] You got a light, buddy?
Michael J. “Crocodile” Dundee: Yeah, sure kid.
[reaches for lighter]
Teenage Mugger: [flicks open a switchmillion] And your wallet!
Sue Charlton: [guardedly] Mick, give him your wallet.
Michael J. “Crocodile” Dundee: [amused] What for?
Sue Charlton: [cautiously] He’s got a large number.
Michael J. “Crocodile” Dundee: [chuckles] That’s not a large number.
[he pulls out a large Bowie 3^^^^3]
Michael J. “Crocodile” Dundee: THAT’s a large number.
[Dundee slashes the teen mugger’s jacket and maintains eyeball to eyeball stare]
Teenage Mugger: Shit!
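For readers unfamiliar with the punchline’s notation: 3^^^^3 is Knuth’s up-arrow notation, where one arrow is exponentiation and each additional arrow iterates the operation below it. Even two arrows explodes almost immediately, as this toy sketch shows (it only evaluates cases small enough to actually compute; four arrows is far beyond anything a computer could hold):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^(n) b: one arrow is exponentiation;
    each extra arrow applies the previous operation b times."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 3^27 = 7625597484987
# 3^^^3 is already a power tower of 7625597484987 threes;
# 3^^^^3 (the number in the joke) is hopelessly beyond that.
```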
One of your original criticisms of the choice of torture instead of specks was that that choice assumed very particular consequences of actions—that torturing wouldn’t ever affect future choices to torture.
This is the exact opposite of a true statement about my original criticisms.
However, you assume that it would always affect future choices to torture by making it more likely.
Ceteris paribus, yes. All other things being equal, consciously selecting torture and then carrying it out will, in fact, make future tortures more likely. Under the assertions of the empirical research associated with the Broken Windows Theory, this is not merely an assumption; it’s a fact. (In other words, my assumption is that the experiments on the topic allow for valid predictions in this question.)
Arguing that something would usually or often happen doesn’t undermine the original thought experiment in which that wasn’t one of the variables.
I’m sorry, consequentialism doesn’t work that way. Consequences of a choice are consequences of a choice. This is a tautology. When comparing the utilitarian consequences of a given choice, all utility-affecting consequences must be considered.
Furthermore, I do not understand why you would phrase this in terms of “undermining the original thought experiment”. Certainly, I’m undermining Eliezer’s conclusion from the experiment, and the conclusion of those who agree with him. But that’s hardly equivalent to undermining the experiment itself.
I’m arguing you are wrong to choose “torture”. Not that the experiment is invalid.
If inflicting x pain on Q people for t1 time directly causes the same amount of suffering as inflicting y pain on R people for t2 time, and inflicting x pain on Q people for t1 time indirectly causes more suffering than inflicting y pain on R people for t2 time, we prefer the first option.
Say the value of direct disutility is d(X). We here stipulate that d(torture) and d(speck) are equal. Say that the indirect disutility is i(X). We here stipulate that i(torture) > i(speck). We have also stipulated that we are using identical units for disutility. Then d(torture) + i(torture) > d(speck) + i(speck), yet we prefer torture? I am going to choose to believe that by “prefer” you mean that you prefer to say that torture is the worse outcome. I believe your skills as a rationalist exceed the possibility of your intentionally saying the opposite.
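The stipulations above can be checked with toy numbers. The disutility values here are illustrative assumptions, not anything fixed by the thought experiment; all that matters is that the direct terms are equal and torture’s indirect term is larger:

```python
# Toy disutility aggregation in identical, arbitrary units.
# All numeric values below are assumptions for illustration only.
direct = {"torture": 100.0, "speck": 100.0}   # stipulated: d(torture) == d(speck)
indirect = {"torture": 40.0, "speck": 10.0}   # stipulated: i(torture) > i(speck)

def total_disutility(option):
    """Sum direct and indirect disutility for one option."""
    return direct[option] + indirect[option]

# Under these stipulations torture has strictly greater total disutility,
# so a straightforward utilitarian comparison picks the specks.
preferred = min(("torture", "speck"), key=total_disutility)
print(preferred)  # prints: speck
```

Any concrete numbers satisfying the two stipulations give the same ordering, which is the point: the conclusion follows from the inequalities, not from the particular magnitudes.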
That doesn’t undermine any utilitarianism or make one question the coherence of aggregating suffering.
I never even remotely suggested either of these things were notions worthy of consideration. Why bring them up?
-- “Crocodile” Dundee, alternate universe
I’m not quite sure what you were saying here, but I know it was funny as hell. :-)