I think morality arises from self-interest, once an agent takes a long-term view of that self-interest, weighs the opinions of other agents, and recognizes how its own reputation feeds back into its long-term self-interest.
Based on this, I will revise my previous estimate of the path morality is taking: it now seems more probable that self-interest and future prediction are the drivers. Agents who approximately implement my utility function would then receive more empathy (cooperation, consideration) as a second-order effect.
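This line of reasoning resembles indirect-reciprocity models from game theory, where agents help partners based on publicly visible reputation. As a rough illustration only (the payoff values, update rules, and population sizes below are my own hypothetical choices, not anything from the text), the following sketch simulates a donation game in which "discriminators" pay a cost to help partners with good reputations, while pure short-term defectors never do:

```python
import random
from statistics import fmean

random.seed(0)

COOP_COST, COOP_BENEFIT = 1.0, 3.0  # hypothetical: helping costs 1, delivers 3

class Agent:
    def __init__(self, discriminator):
        self.discriminator = discriminator  # True: help only good-reputation partners
        self.reputation = 1.0               # public score, starts fully good
        self.payoff = 0.0

    def will_help(self, partner):
        if not self.discriminator:
            return False                    # pure short-term self-interest: never pay the cost
        return partner.reputation >= 0.5    # long-term view: help those seen to help others

def play_round(agents):
    donor, recipient = random.sample(agents, 2)
    if donor.will_help(recipient):
        donor.payoff -= COOP_COST
        recipient.payoff += COOP_BENEFIT
        donor.reputation = min(1.0, donor.reputation + 0.1)
    elif recipient.reputation >= 0.5:
        # refusing a good-reputation partner damages one's own standing
        donor.reputation = max(0.0, donor.reputation - 0.1)

agents = [Agent(True) for _ in range(8)] + [Agent(False) for _ in range(8)]
for _ in range(4000):
    play_round(agents)

disc_avg = fmean(a.payoff for a in agents if a.discriminator)
def_avg = fmean(a.payoff for a in agents if not a.discriminator)
print(f"discriminators: {disc_avg:.1f}, defectors: {def_avg:.1f}")
```

In runs of this toy model, defectors quickly lose reputation and stop receiving help, while discriminators sustain mutually profitable exchanges, which is the "second-order empathy" effect described above: cooperation accrues to agents whose behavior approximates one's own decision rule.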