there’s a lot of work showing how [utilitarianism] aligns pretty closely with most people’s moral intuitions
There is? Could you link to some examples?
It’s my understanding that utilitarianism does not align with most people’s moral intuitions, in fact. I would be at least moderately surprised to learn that the opposite is true.
You might want to read up on utilitarianism if you haven’t, because you’ll find it the starting point for many discussions of ethics on LessWrong.
Utilitarianism, however, has many, many problems. How familiar are you with critiques of it?
I’m not arguing that utilitarianism is correct in any absolute sense, or that it aligns perfectly with moral intuitions. I was just trying to explain why so many people around here are so into it. I’m familiar with many critiques of utilitarianism. I’m not aware of any ethical system that aligns better with moral intuitions. No system is going to align perfectly with our moral intuitions, because the intuitions themselves aren’t systematic.
Any system with a slot for intention does.
Have you read Eliezer’s Ends Don’t Justify Means (Among Humans)?
What am I supposed to be getting out of that? Inasmuch as it is a half-hearted defence of deontology, it isn’t a wholehearted defence of pure utilitarianism.
Eliezer is usually viewed as a utilitarian, which would make him a consequentialist. His point in that article seems to be an acknowledgement that because human thinking is so prone to self-justification, deontology has its merits, which I thought related to your point about caring about intentions as well as effects.
It’s not a given that utilitarianism involves caring about intentions.
Rather the opposite. Utilitarianism cares about outcomes, so to a first approximation it doesn’t factor in intentions at all. Of course, if someone intends to harm me but somehow fails and instead unintentionally does me good, then even though I haven’t been harmed, I have a reasonable concern that they might try again, perhaps more successfully next time. So intentions matter under utilitarianism to the extent that they can be used to predict the probabilities of outcomes, plus, of course, to the extent that they hurt feelings or cause concern, which are actual emotional harms.
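To put that a bit more formally (a toy formalization of my own, with notation I’m introducing here rather than anything standard): if $o$ ranges over possible outcomes, the expected utility of dealing with an agent who has intention $i$ is

$$\mathrm{EU}(i) = \sum_{o} P(o \mid i)\, U(o),$$

and the intention $i$ appears only inside the probability term $P(o \mid i)$, never as an argument of $U$ itself. Intentions still move $\mathrm{EU}$ indirectly, either by changing which outcomes are likely (the failed attacker who might try again) or because an outcome $o$ can itself include emotional harms such as hurt feelings.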
Whose moral intuitions? Clearly not everyone’s. But most people’s? Is that your claim? Or only yours? Or most people’s on Less Wrong? Or…?