You seem to be using the word “morality” as a synonym for “values” or “utility function.”
Often I do, particularly when talking about moralities other than human.
Other times I will make what I consider a relevant functional distinction between general values and moral values in humans: moral values involve approval/disapproval in a recursive way, i.e. punishing an attitude, punishing failure to punish that attitude, punishing failure to punish the failure to punish it, etc. And similarly for rewards.
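To make the recursive structure concrete, here is a toy sketch (my own illustration, not anything from the thread): level 0 sanctions the attitude itself, level 1 sanctions failure to apply the level-0 sanction, and so on.

```python
def sanctions(attitude: str, depth: int) -> list[str]:
    """Enumerate the nested levels of moral sanction up to `depth`.

    Level 0 punishes the attitude itself; each further level punishes
    failure to enforce the level below it.
    """
    norms = [f"punish holding the attitude '{attitude}'"]
    for _ in range(depth):
        # Each new norm targets non-enforcement of the previous norm.
        norms.append(f"punish failure to: {norms[-1]}")
    return norms

for norm in sanctions("defection", 2):
    print(norm)
```

The same recursion applies with "reward" substituted for "punish", as the comment notes.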
Elaborating on your point about supporting the continued existence of those who support your morality: on purely consequentialist grounds, Clippy is likely to engage in this kind of recursive punishment/reward as well, so he wouldn't be entirely a Lovecraftian horror until the power of humans became inconsequential to his ends.
consider morality to be a sort of abstract concept that concerns the wellbeing of people
Likely Clippy will be "concerned" about the wellbeing of people to the extent that such wellbeing concerns the making of paperclips. Does that make him moral, according to your usage?
What kind of concern counts, and to which people must it apply? Concern is a very broad concept; the alternative is being unconcerned or indifferent. Clippy isn't likely to be indifferent, and neither are sociopaths or babyeaters. Sociopaths and babyeaters are likely concerned about their own wellbeing, at least.
Clippy is likely to engage in this kind of recursive punishment/reward as well, so he wouldn’t be entirely a Lovecraftian horror until the power of humans became inconsequential to his ends.
I don’t deny that it may be useful to create some sort of creature with nonhuman values for instrumental reasons. We do that all the time now. For instance, we breed domestic cats for their instrumental value as companions and mousers, even though, if they are neurologically advanced enough to have values at all, those values are probably rather sociopathic.
Likely Clippy will be "concerned" about the wellbeing of people to the extent that such wellbeing concerns the making of paperclips. Does that make him moral, according to your usage?
Sorry, I should have been clearer. In order to count as morality, the concern for the wellbeing of others has to be a terminal value. In Clippy’s case his concern is only an instrumental value, so he isn’t moral.
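The terminal/instrumental distinction can be put in utility-function terms. A toy sketch (my own, with an assumed made-up productivity coefficient): for Clippy, human wellbeing enters his utility only through its effect on paperclip output, whereas a moral agent's utility contains a wellbeing term in its own right.

```python
def paperclips_produced(effort: float, wellbeing: float) -> float:
    # Assumed toy relationship: happier workers make more paperclips.
    return effort * (1.0 + 0.1 * wellbeing)

def clippy_utility(effort: float, wellbeing: float) -> float:
    # Wellbeing matters only via the paperclip channel (instrumental).
    return paperclips_produced(effort, wellbeing)

def moral_utility(effort: float, wellbeing: float) -> float:
    # Wellbeing is also valued for its own sake (terminal).
    return paperclips_produced(effort, wellbeing) + wellbeing
```

On this picture, cut the causal link between wellbeing and paperclips and Clippy's "concern" vanishes, while the moral agent's does not.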