If you don’t accept the additivity of harm, you accept that for any harm x, there is some number of people y for which 2^y people suffering harm x is the same, welfare-wise, as y people suffering harm x.
(Not to mention that, when normalized across people, utils are meant to provide direct and simple mathematical comparisons. In this case, it doesn’t really matter how the normalization is done, since the inequality holds for any epsilon of dust-speck harm greater than zero.)
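To put the first claim in symbols (the notation W(n, x) is mine, introduced only for this paraphrase: the aggregate welfare loss when n people each suffer harm x, with additivity normalized so that one person's loss is just x):

\[ \text{additivity:}\quad W(n, x) = n\,x \]
\[ \text{the claim:}\quad \neg\,\text{additivity} \;\Rightarrow\; \forall\, x > 0 \;\; \exists\, y:\;\; W(2^{y}, x) = W(y, x) \]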
Polling people to find out whether they would take a dust speck attaches an additional, external harm to the torture option (e.g., mental distress at the thought of someone being tortured). Since they would prefer the dust speck, this indicates that they find the thought of someone being subjected to 50 years of torture (Harm a) more harmful than a dust speck (Harm b). Harm a > Harm b, so n Harm a > n Harm b, and it doesn’t even matter what Harm a or Harm b actually is, nor what the additional, non-distributed harm of the actual torture comes to.
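Spelling that comparison out (again my labels: H_a for Harm a, H_b for Harm b, T for the non-distributed harm of the torture itself, n for the number of people polled):

\[ H_a > H_b \;\Rightarrow\; n\,H_a > n\,H_b \quad \text{for any } n > 0, \]
\[ \text{torture option: } n\,H_a + T \;>\; n\,H_b \;=\; \text{dust-speck option}, \]

so the torture option already loses on the polled distress alone, before the torture itself is counted.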
How could this be tricked? Replace the dust speck with the smallest harm that is greater than the distress at the thought of someone being tortured for 50 years, say, the thought of someone being tortured for 51 years.