Your morality is which part of your goals? If there is no criterion distinguishing moral goals from non-moral ones, then a society of selfish jerks who always defect would be 100% moral. But if morality is related to unselfish, cooperative behaviour, as most people believe, then game theory is potentially relevant.
(There’s a posting where Yudkowsky kind-of-sort-of argues for the morality-is-goals theory, and a subsequent one where he notices the problem and starts talking about non-jerkish values.)
The selfish jerks would not be rational, as they wouldn’t be winning. That’s what the game theory is about.
The game theory is independent of morality. In some such games, winning happens to involve being good to your neighbours. But in others, winning may involve doing evil to your neighbours. It would be nice if the morally best action (“What is hateful to you, do not do to your neighbour”) were always to be the selfishly winningest one, but while the examples have this property, I do not think it has been established in general.
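To make the tension concrete, here is a minimal sketch (with assumed, conventional payoff numbers) of the one-shot Prisoner’s Dilemma: defecting is the selfishly “winningest” move against either response, even though mutual cooperation beats mutual defection for both players.

```python
# payoffs[(my_move, their_move)] -> (my_payoff, their_payoff)
# "C" = cooperate, "D" = defect; numbers are the usual illustrative ones.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def my_payoff(me, them):
    return payoffs[(me, them)][0]

# Defection strictly dominates cooperation, whatever the neighbour does...
for their_move in ("C", "D"):
    assert my_payoff("D", their_move) > my_payoff("C", their_move)

# ...yet mutual cooperation is better for both than mutual defection.
assert payoffs[("C", "C")] > payoffs[("D", "D")]
```

So in this game the selfish optimum and the morally best action come apart at the individual level, which is why the general claim needs an argument rather than examples.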
The selfish jerks would not be rational, as they wouldn’t be winning
They would be doing as well as you can do if you refuse to cooperate. More importantly, cooperation isn’t a Pareto improvement on defection: not every game is a Prisoner’s Dilemma, so some cooperators take a hit.
But in others, winning may involve doing evil to your neighbours. It would be nice if the morally best action (“What is hateful to you, do not do to your neighbour”) were always to be the selfishly winningest one,
It isn’t. It’s not a Pareto improvement.
Playing cooperatively can “grow the pie”, producing more overall value, even while generating individual losers.
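A toy illustration with made-up numbers: moving from the all-defect outcome to the cooperative outcome grows the pie (total payoff rises), yet it is not a Pareto improvement, because one player ends up worse off.

```python
# Hypothetical two-player outcomes; the names and numbers are invented
# purely to illustrate "bigger pie, but not Pareto-better".
defect_outcome    = {"alice": 4, "bob": 4}   # status quo: everyone defects
cooperate_outcome = {"alice": 9, "bob": 2}   # bigger total, but Bob loses

# The pie grows...
assert sum(cooperate_outcome.values()) > sum(defect_outcome.values())
# ...but Bob takes a hit, so cooperation is not a Pareto improvement.
assert cooperate_outcome["bob"] < defect_outcome["bob"]
```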
The way you operate affects, among other things, how things would turn out if you were playing against copies of yourself. ‘Morality’ (depending on what you think is moral) can perhaps be described as a subset of ways of operating. So it is less ‘a connection to morality’ and more ‘does morality have certain optimality* properties?’: figure out those properties, and then draw some comparisons to morality.

*Hence the game-theoretic issues.
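A sketch of the “playing against copies of yourself” point, in an iterated Prisoner’s Dilemma with the usual assumed payoffs: a conditional cooperator facing its own copy outscores an unconditional defector facing its copy.

```python
# payoffs[(my_move, their_move)] -> (my_payoff, their_payoff)
payoffs = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def score_vs_copy(strategy, rounds=10):
    """Total payoff a strategy earns over `rounds` rounds against a copy of itself."""
    total = 0
    my_last = their_last = None
    for _ in range(rounds):
        my_move = strategy(their_last)      # I respond to what the copy did last
        their_move = strategy(my_last)      # the copy responds to what I did last
        total += payoffs[(my_move, their_move)][0]
        my_last, their_last = my_move, their_move
    return total

always_defect = lambda last: "D"
tit_for_tat   = lambda last: "C" if last in (None, "C") else "D"

# Against its own copy, tit-for-tat cooperates every round and scores 30;
# always-defect scores only 10 against its copy.
assert score_vs_copy(tit_for_tat) > score_vs_copy(always_defect)
```

The comparison is between each way of operating paired with itself, which is what makes it a statement about the strategy’s optimality properties rather than about any single matchup.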
Your morality is a part of your goals, what your instrumental rationality is pursuing. Therefore there is no conflict between them.
The rest of the post seems to be about game-theoretic issues, and I don’t see a connection to morality there.