Eisegates, is there no limit to the number of people you would subject to a punch in the face (very painful but temporary with no risk of death) in order to avoid the torture of one person? What if you personally had to do (at least some of) the punching? I agree that I might not be willing to personally commit the torture despite the terrible (aggregate) harm my refusal would bring, but I’m not proud of that fact—it seems selfish to me. And extrapolating your position seems to justify pretty terrible acts. It seems to me that the punch is equivalent to some very small amount of torture.
1. My intuition on this point is very insensitive to scale. You could put a googol of persons in the galaxy, and faced with a choice between torturing one of them and causing them each to take one punch, I’d probably choose the punches.
2. Depends how much punching I had to do. I’d happily punch a hundred people, and let others do the rest of the work, to keep one stranger from getting tortured for the rest of his life. Make it one of my loved ones at risk of torture, and I would punch people until the day I die (presumably, I would be given icepacks from time to time for my hand).
3. Extrapolating is risky business with ethical intuitions. Change the facts and my intuition might change, too. I think that, in general, ethical intuitions are highly complex products of social forces that do not reduce well to abstract moral theories—either of the deontological or utilitarian variety. And to the extent that moral statements are cognitive, I think they are referring to these sorts of sociological facts—meaning that any abstract theory will end up incorporating a lot of error along the way.
Would you really feel selfish in the dust speck scenario? I think, at the end of the day, I’d feel pretty good about the whole thing.
Unknown: 10 years and I would leave the lever alone, no doubt. 1 day is a very hard call; probably I would pull the lever. Most of us could get over 1 day of torture in a way that is fundamentally different from years of torture, after all.
Perhaps you can defend one punch per human being, but there must be some number of human beings for whom one punch each would outweigh torture.
As I said, I don’t have that intuition. A punch is a fairly trivial harm. I doubt I would ever feel worse about a lot of people (even 5^^^^^^5) getting punched than about a single individual being tortured for a lifetime. Sorry—I am just not very aggregative when it comes to these sorts of attitudes.
Is that “irrational”? Frankly, I’m not sure the word applies in the sense you mean. It is inconsistent with most accounts of strict utilitarianism. But I don’t agree that abstract ethical theories have truth values in the sense you probably assume. It is consistent with my attitudes and preferences, and with my society’s attitudes and preferences, I think. You assume that we should be able to add those attitudes up and do math with them, but I don’t see why that should necessarily be the case.
I think the difference is that you are assuming (at least in a very background sort of way) that there are non-natural, mind-independent, moral facts somehow engrafted onto the structure of reality. You feel like those entities should behave like physical entities, however, in being subject to the sort of mathematical relations we have developed based upon our interactions with real-world entities (even if those relations are now used abstractly). Even if you could make a strong argument for the existence of these sorts of moral rules, however, that is a far cry from saying that they should have an internal structure that behaves in a mathematically tidy way.
You haven’t ever given reasons to think that ethical truths ought to obey mathematical rules; you’ve just assumed it. It’s easy to miss this assumption unless you’ve spent some time mulling over moral ontology, but it definitely animates most of the arguments made in this thread.
In short: unless you’ve grappled seriously with what you mean when you talk of moral rules, you have very little basis for assuming that you should be able to do sums with them. Is one punch each for 6 billion people “worse than” 50 years of torture for one person? It certainly involves the firing of more pain neurons. But the fact that a number of pain neurons fire is just a fact about the world; it isn’t the answer to a moral question, UNLESS you make a large number of assumptions. I agree that we can count neuron-firings, and do sums with them, and all other sorts of things. I just disagree that the firing of pain and pleasure neurons is the sum total of what we mean when we say “it was wrong of Fred to murder Sally.”
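The “do sums with them” move can be made concrete. Here is a minimal sketch, in Python, of the naive additive-aggregation model being criticized above; every numeric weight is a hypothetical illustration, not a claim about actual moral magnitudes:

```python
# Naive utilitarian aggregation: assign each harm a scalar "disutility"
# and compare totals. All numbers below are hypothetical placeholders.

def aggregate_disutility(harm_per_person: float, num_people: int) -> float:
    """Additive aggregation: total harm = per-person harm * head count."""
    return harm_per_person * num_people

PUNCH = 1.0                    # one punch, in arbitrary "harm units" (assumed)
LIFETIME_TORTURE = 10 ** 9     # hypothetical weight for 50 years of torture

punches = aggregate_disutility(PUNCH, 6_000_000_000)  # six billion punched once
torture = aggregate_disutility(LIFETIME_TORTURE, 1)   # one person tortured

# Under simple addition the punches "outweigh" the torture...
print(punches > torture)  # True

# ...but the comparison only goes through if harms add linearly across
# persons -- exactly the background assumption the comment disputes.
```

The point of the sketch is that the conclusion is baked into the model: choose additive aggregation and a finite torture weight, and some number of punches always wins; reject that structure and the sum never gets off the ground.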