Here’s what a moral realist might say:
The ‘morality’ module within the utility function is pretty similar across all humans.
Given that our evolved morality is in part used to solve cooperation and other game theoretic problems, a rational psychopath might want to self-modify to care about ‘morality’.
I would expect a rational psychopath to instead study game theory and beat human players who employ predictable, exploitable strategies.
If there’s a long-term effective strategy for cheating—one that doesn’t involve the cheater being detected and punished—why isn’t everyone using it?
Because we evolved to care about things like fairness in an environment where everyone knew everyone else: if you cheated someone, the whole village heard about it. Modern humans still run on those evolved instincts, so agents who lack moral concerns can exploit intuitions that were optimized for a different situation. For instance, they can avoid acts so heinous that society as a whole hunts them down, and once they have exploited someone, they can simply move.
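The asymmetry above can be sketched as a toy Prisoner's Dilemma calculation. This is a minimal illustration under assumed standard payoff values (not from the text): in a "village," a tit-for-tat partner remembers the first defection and punishes it forever, while a mobile cheater in a large anonymous population meets a fresh, trusting partner every round.

```python
# Standard one-shot Prisoner's Dilemma payoffs (illustrative assumption):
# (my move, their move) -> my payoff, where C = cooperate, D = defect.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I am exploited
    ("D", "C"): 5,  # I exploit a cooperator
    ("D", "D"): 1,  # mutual defection
}

def village_cheater(rounds):
    """Cheater stuck with one tit-for-tat villager: the first defection
    is remembered, so every later round is mutual defection."""
    return PAYOFF[("D", "C")] + (rounds - 1) * PAYOFF[("D", "D")]

def honest_cooperator(rounds):
    """Two cooperators, anywhere: mutual cooperation every round."""
    return rounds * PAYOFF[("C", "C")]

def mobile_cheater(rounds):
    """Cheater who moves after each con: every partner is fresh,
    trusting, and cooperates on their only interaction."""
    return rounds * PAYOFF[("D", "C")]

rounds = 10
print(village_cheater(rounds))    # 5 + 9*1 = 14: cheating loses in the village
print(honest_cooperator(rounds))  # 10*3  = 30
print(mobile_cheater(rounds))     # 10*5  = 50: cheating wins with anonymity
```

Under these assumed payoffs, the intuition "cheaters get caught, so cheating doesn't pay" is correct in the repeated-interaction environment our instincts were tuned for, and exactly wrong once the cheater can keep resetting their reputation.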