The word “utilitarianism” technically means something like, “an algorithm for determining whether any given action should or should not be undertaken, given some predetermined utility function”. However, when most people think of utilitarianism, they usually have a very specific utility function in mind. Taken together, the algorithm and the function do indeed imply certain “ethical obligations”, which are somewhat tautologically defined as “doing whatever maximizes this utility function”.
In general, the word “utilitarian” has been effectively re-defined in common speech as something like, “ruthlessly efficient to the point of extreme ugliness”, so utilitarianism gets the horns effect from that.
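The "algorithm plus predetermined utility function" framing above can be sketched as a toy decision rule. Everything here is illustrative: the actions, the outcomes, and the choice of "total welfare" as the utility function are hypothetical stand-ins, not a claim about any particular ethical theory's details.

```python
# Toy sketch of the act-utilitarian decision procedure: choose the
# action whose outcome maximizes a predetermined utility function.
# Actions, outcomes, and the utility function are all placeholders.

def choose_action(actions, outcomes, utility):
    """Return the action whose outcome maximizes `utility`."""
    return max(actions, key=lambda a: utility(outcomes[a]))

# Hypothetical example: outcomes are lists of per-person welfare
# levels, and the utility function is total welfare.
outcomes = {
    "donate": [5, 5, 5],   # modest benefit spread across three people
    "hoard":  [12, 0, 0],  # large benefit concentrated on one person
}
total_welfare = sum  # the "very specific utility function in mind"

best = choose_action(outcomes.keys(), outcomes, total_welfare)
print(best)  # "donate": total welfare 15 beats 12
```

Note that the "ethical obligation" falls out tautologically, as the comment says: the obligatory action is just whatever `choose_action` returns for the chosen utility function.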
an algorithm for determining whether any given action should or should not be undertaken, given some predetermined utility function
That’s not how the term “utilitarianism” is used in philosophy. The utility function has to be agent-neutral. So a utility function where your welfare counts 10x as much as everyone else’s wouldn’t be utilitarian.
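The agent-neutrality point can be made concrete: an agent-neutral aggregation weights everyone's welfare equally, so it is invariant under permuting who gets what, while the 10x self-weighted function from the reply is not. A minimal sketch, with illustrative numbers:

```python
# Toy illustration of agent neutrality. An agent-neutral utility
# function weights each person's welfare equally, so permuting the
# welfare levels among people leaves its value unchanged. A function
# that counts the agent's own welfare 10x breaks that invariance.

def agent_neutral_utility(welfares):
    # Equal weight for everyone: a permissible utilitarian aggregation.
    return sum(welfares)

def self_favoring_utility(welfares, me=0):
    # Counts the agent's own welfare 10x -- not utilitarian.
    return sum(w * (10 if i == me else 1) for i, w in enumerate(welfares))

a = [3, 1, 1]  # the agent (index 0) gets 3, others get 1 each
b = [1, 1, 3]  # the same welfare levels, permuted away from the agent

print(agent_neutral_utility(a) == agent_neutral_utility(b))  # True
print(self_favoring_utility(a) == self_favoring_utility(b))  # False
```

The permutation test is one common way to cash out agent neutrality; the philosophical literature states the condition in terms of reasons rather than code, so this is only an informal rendering.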