It seems like you can make similar arguments for virtue ethics and acausal trade.
If another agent is able to simulate you well, it helps them coordinate with you: they know what you will do without communicating. When you can't predict well what other people will do, it takes far more computation to figure out how to get what you want, and whether that's compatible with them getting what they want.
By making yourself easily simulated, you open yourself up to ambient control; by not being easily simulated, you're difficult to trust. Lawful Stupid seems to happen when you have too many rules enforced too inflexibly, and (in fiction, at least) other characters often take advantage of that very easily.
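A toy illustration of the computational point, with made-up payoffs and policies: in a pure coordination game, an agent that can simulate its partner's (deterministic) policy coordinates in a single evaluation, while an agent without such a model has to probe options blindly, at a cost that grows with the size of the option space.

```python
# Pure coordination game: both players pick one of N options;
# they win only if they pick the same one. (Hypothetical setup.)
OPTIONS = ["A", "B", "C", "D"]

def partner_policy():
    # A deterministic, transparent policy: easy to simulate.
    return "C"

def coordinate_by_simulation(policy):
    # If the partner is easy to simulate, just run their policy
    # and match it -- one evaluation, no communication needed.
    return policy()

def coordinate_by_search(observe_outcome):
    # Without a model of the partner, probe options until one
    # works -- cost scales with the size of the option space.
    attempts = 0
    for option in OPTIONS:
        attempts += 1
        if observe_outcome(option):
            return option, attempts
    return None, attempts

# With simulation: immediate coordination.
choice = coordinate_by_simulation(partner_policy)

# Without simulation: blind trial and error against the same partner.
found, cost = coordinate_by_search(lambda o: o == partner_policy())

print(choice, found, cost)
```

Here the "easily simulated" partner is matched in one step, while the search costs up to one attempt per option; with a less transparent or mixed policy, the search side only gets worse.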