The following uses a really crude ontology, but I think my point survives it. Model the part of our brains that tells us what is moral as simply a subroutine of a larger utility pseudo-function that determines what we actually do (which also weighs, say, selfish pleasures). It seems both possible and plausible that the “morality part” will assign a high value to some action that the larger function does not. And of course, this happens all the time: most people have felt like they failed to live up to their own values. If you program an AI to do what a mere subroutine of your utility function says, then it is entirely possible for the AI to ask you to do something you really don’t want to do.
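To make the toy model concrete, here’s a minimal sketch in Python. Everything in it is illustrative, not from the original: the two actions and their numeric values are made up purely to show how the subroutine and the larger function can disagree about the best thing to do.

```python
# Crude toy model: overall "utility" is the moral subroutine's output
# plus a selfish-pleasure term. The actions and numbers are hypothetical.

def moral_value(action):
    # The "morality part": how good the action looks to your values alone.
    return {"donate_kidney": 10, "watch_tv": 0}[action]

def selfish_pleasure(action):
    # Everything else the larger pseudo-function cares about.
    return {"donate_kidney": -15, "watch_tv": 5}[action]

def overall_utility(action):
    # The larger function that actually determines what you do.
    return moral_value(action) + selfish_pleasure(action)

actions = ["donate_kidney", "watch_tv"]

# The moral subroutine's favorite action...
print(max(actions, key=moral_value))      # donate_kidney
# ...differs from what the whole function endorses doing.
print(max(actions, key=overall_utility))  # watch_tv
```

An AI trained only on `moral_value` would press you to donate the kidney, even though your overall utility function says to stay on the couch; that gap is the point.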
It’s a problem for total utilitarianism insofar as it is yet another bullet to bite, though this is really just a redescription of the utility monster bullet. I’m not a total utilitarian, and I find such bullets too bitter. But the mere existence of a gap between “what we should do” and “what we want to do” doesn’t seem very important.