> So, utilitarianism isn’t true, it is a matter of taste (preferences, values, etc.)?
Saying utilitarianism isn’t true because some people aren’t automatically motivated to follow it is like saying that grass isn’t green because some people wish it were purple. If you don’t want to follow utilitarian ethics, that doesn’t mean they aren’t true; it just means that you’re not nearly as good a person as someone who does. If you genuinely want to be a bad person then nothing can change your mind, but most human beings place at least some value on morality.
You’re confusing moral truth with motivational internalism. Motivational internalism holds that moral knowledge is intrinsically motivating: simply knowing that something is good and right motivates a rational entity to do it. That’s obviously false.
Its opposite is motivational externalism, which holds that we are motivated to act morally by our moral emotions (e.g., sympathy and compassion) and by willpower. Motivational externalism seems obviously correct to me. It in turn implies that people will often act immorally when their willpower, compassion, and other moral emotions are depleted, even if they know intellectually that their behavior is less moral than it could be.
> If you have ever purchased a birthday present for, say, your husband instead of feeding the hungry (who would have gotten more utility from those particular resources), then to that extent your values are not utilitarian (as demonstrated by WARP).
There is a vast, vast amount of writing at Less Wrong on the fact that people’s behavior and their values often fail to coincide. Have you never read anything on the topic of “akrasia”? Revealed preference is moderately informative about people’s values, but it is nowhere near 100% reliable. If someone talks about how utilitarianism is correct but often fails to act in utilitarian ways, it is highly likely that they are suffering from akrasia and lack the willpower to act on their values.
> Even if you could measure utility perfectly and perform rock-solid interpersonal utility calculations, I suspect that you would still not weigh your own well-being (nor that of your husband, friends, etc.) equally with that of random strangers. If I am right about this, then your defence of utilitarianism as your own personal system of value fails on the ground that it is a false claim about a particular person’s preferences (namely, you).
You don’t seem to understand the difference between categorical and incremental preferences. If juliawise spends 50% of her time doing selfish stuff and 50% of her time doing utilitarian stuff, that doesn’t mean she has no preference for utilitarianism. That would be like saying that I don’t have a preference for pizza because I sometimes eat pizza and sometimes eat tacos.
Furthermore, I expect that if juliawise were given a magic drug that completely removed her akrasia she would behave in a much more utilitarian fashion.
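To make the incremental-preference point concrete, here is a minimal toy model. All the numbers are hypothetical, invented purely for illustration: an agent who genuinely values utilitarian activity twice as much as selfish activity, but pays a willpower (akrasia) cost for it, optimally lands near a 50/50 time split; remove the akrasia cost and the very same values push the optimum far toward utilitarian activity.

```python
import math

# Toy model (hypothetical numbers): the agent values utilitarian
# activity twice as much as selfish activity, but pays a willpower
# (akrasia) cost proportional to utilitarian effort.

def utility(frac_utilitarian, akrasia_cost=0.0):
    selfish = math.sqrt(1 - frac_utilitarian)     # diminishing returns
    altruistic = 2 * math.sqrt(frac_utilitarian)  # valued twice as much
    return selfish + altruistic - akrasia_cost * frac_utilitarian

def best_split(akrasia_cost):
    """Brute-force grid search for the utility-maximizing time split."""
    grid = [i / 100 for i in range(101)]
    return max(grid, key=lambda f: utility(f, akrasia_cost))

print(best_split(akrasia_cost=0.7))  # 0.5 -- the observed 50/50 behavior
print(best_split(akrasia_cost=0.0))  # 0.8 -- the "magic drug" counterfactual
```

The point is only that a mixed behavioral record underdetermines values: the same underlying preferences produce very different observed splits depending on the size of the akrasia term.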
> As for the former, I have requested of sophisticated and knowledgeable utilitarians that they tell me what experiences I should anticipate in the world if utilitarianism is true (and that I should not anticipate if other, contradictory, moral theories were true) and, so far, they have been unable to do so.
If utilitarianism were true, we could expect to see a correlation between willpower and morally positive behavior. This appears to be true; in fact, such behaviors are lumped together into the trait “conscientiousness” because they are correlated.
If utilitarianism were true, then deontological rule systems would be vulnerable to Dutch-booking, while utilitarianism would not be. This also appears to be true (a toy money-pump sketch follows these points).
If utilitarianism were true, then it would be unfair for multiple people to have different utility levels, all else being equal. This is practically tautological.
If utilitarianism were true, then goodness would consist primarily of doing things that benefit yourself and others. Again, this is practically tautological.
Now, these pieces of evidence don’t necessarily point to utilitarianism; other types of consequentialist theories might also explain them. But they are informative.
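On the Dutch-booking point, here is a minimal money-pump sketch. It is my own toy illustration, not anything from this exchange, and it assumes the standard mechanism: a rule system that yields cyclic (non-transitive) choices can be charged a small fee per trade and walked in a circle, while an agent maximizing a single real-valued utility function has transitive preferences and cannot be cycled this way.

```python
# Toy money pump: an agent with cyclic preferences over outcomes
# A, B, and C will pay a small fee for each "upgrade" and can be
# walked in a circle, losing money while ending where it started.

prefers = {("A", "B"), ("B", "C"), ("C", "A")}  # cyclic: A > B > C > A

def run_money_pump(start, offers, fee=1):
    holding, wealth = start, 0
    for offered in offers:
        if (offered, holding) in prefers:  # agent strictly prefers the offer...
            holding = offered              # ...so it pays the fee to switch
            wealth -= fee
    return holding, wealth

holding, wealth = run_money_pump("C", ["B", "A", "C"] * 3)
print(holding, wealth)  # C -9: back where it started, 9 units poorer
```

The sketch does not show that any particular deontological system actually produces cyclic choices; that is the substantive premise the Dutch-book argument needs.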
> As for the latter, according to my revealed preferences, utilitarianism does not describe my preferences at all accurately, so is not much use for determining how to act. Simply, it is not, in fact, my value system.
Again, ethical systems are not intrinsically motivating. If you don’t want to follow utilitarianism, that doesn’t mean it’s not true; it just means that you’re a person who sometimes treats other people unfairly and badly. Again, if that doesn’t bother you then there are no universally compelling arguments. But if you’re a reasonably normal human, it might bother you a little and make you want to find a consistent system to guide you in your attempts to behave better. Like utilitarianism.