As fubarobfusco pointed out, your argument includes the implication that discovering or publicizing unpleasant truths can be morally wrong (because the participants were ignorant in the original formulation). It’s not obvious to me that any moral theory is committed to that position.
And without that moral conclusion, I think Eliezer is correct that a total utilitarian is committed to believing that choosing TORTURE over SPECKS maximizes total utility. The repugnant conclusion really is that repugnant. None of that was an obvious result to me.
Any utility function that does not assign an explicit, overwhelmingly positive value to truth, but does assign an explicit positive value to “pleasure,” obviously implies that discovering or publicizing unpleasant truths can be morally wrong. I don’t see why that is relevant.
If the problem text completely specifies all the utilities, then TORTURE maximizes total utility by definition; there is nothing further to be committed to. But in that case, “torture” is just a label. It cannot refer to real torture, because real torture would produce different utility changes for the people involved.
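The arithmetic behind the total-utilitarian claim can be sketched numerically. The disutility figures below are illustrative assumptions, not canonical values, and since 3^^^3 is far too large to represent directly, a drastically smaller stand-in population is used; the point is that even that stand-in already tips the total:

```python
# Hedged sketch of the total-utilitarian comparison of SPECKS vs TORTURE.
# All numbers here are assumptions chosen for illustration only.

speck_disutility = 1e-9     # assumed tiny harm of one dust speck
torture_disutility = 1e12   # assumed enormous harm of 50 years of torture

# 3^^^3 cannot be written out; even a vastly smaller population suffices.
population = 1e30           # stand-in, astronomically smaller than 3^^^3

total_specks = speck_disutility * population  # aggregate speck harm
total_torture = torture_disutility            # harm falls on one person

# Total utilitarianism compares only the aggregates:
print(total_specks > total_torture)  # SPECKS is the worse outcome
```

Because 3^^^3 dwarfs any ratio between the two per-person harms, the conclusion is insensitive to the particular numbers assumed, which is exactly what makes it binding on a total utilitarian.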