Unknown: 10 years and I would leave the lever alone, no doubt. 1 day is a very hard call; probably I would pull the lever. Most of us could get over 1 day of torture in a way that is fundamentally different from years of torture, after all.
Perhaps you can defend one punch per human being, but there must be some number of human beings for whom one punch each would outweigh torture.
As I said, I don’t have that intuition. A punch is a fairly trivial harm. I doubt I would ever feel worse about a lot of people (even 5^^^^^^5) getting punched than about a single individual being tortured for a lifetime. Sorry—I am just not very aggregative when it comes to these sorts of attitudes.
Is that “irrational”? Frankly, I’m not sure the word applies in the sense you mean. It is inconsistent with most accounts of strict utilitarianism. But I don’t agree that abstract ethical theories have truth values in the sense you probably assume. It is consistent with my attitudes and preferences, and with my society’s attitudes and preferences, I think. You assume that we should be able to add those attitudes up and do math with them, but I don’t see why that should necessarily be the case.
I think the difference is that you are assuming (at least in a very background sort of way) that there are non-natural, mind-independent moral facts somehow engrafted onto the structure of reality. You feel that those entities should behave like physical entities, in being subject to the sort of mathematical relations we have developed through our interactions with real-world entities (even if those relations are now used abstractly). But even if you could make a strong argument for the existence of these sorts of moral rules, that is a far cry from showing that they have an internal structure that behaves in a mathematically tidy way.
You haven’t ever given reasons to think that ethical truths ought to obey mathematical rules; you’ve just assumed it. It’s easy to miss this assumption unless you’ve spent some time mulling over moral ontology, but it definitely animates most of the arguments made in this thread.
In short: unless you’ve grappled seriously with what you mean when you talk of moral rules, you have very little basis for assuming that you should be able to do sums with them. Is one punch each for 6 billion people “worse than” 50 years of torture for one person? It certainly involves the firing of more pain neurons. But the fact that a number of pain neurons fire is just a fact about the world; it isn’t the answer to a moral question, UNLESS you make a large number of assumptions. I agree that we can count neuron-firings, do sums with them, and do all sorts of other things with them. I just disagree that the firing of pain and pleasure neurons is the sum total of what we mean when we say “it was wrong of Fred to murder Sally.”
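To make the assumption I am objecting to fully explicit, here is a sketch of the aggregative picture as I understand it; the additive form and the per-person harm values $h_i$ are the utilitarian’s premises, not anything I endorse:

$$
\text{Badness} \;=\; \sum_{i=1}^{N} h_i, \qquad\text{so that}\qquad N \cdot h_{\text{punch}} \;>\; h_{\text{torture}} \quad \text{for large enough } N.
$$

My whole point is that nothing about the world, or about neuron-firings, forces “badness” to have this additive structure; the summation is exactly the step that needs a defense.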