“I am nice because it feels good to be nice. Don’t you have that?”
Not really, no. Or, I mean, sure, I sometimes feel that, but it's not the reason I'm nice.
“What is the reason, then?”
I’m nice because it’s instrumentally useful. Win-win situations are good. It doesn’t cost much for me to be nice. Even in the cases where the other person is not nice, revenge is a dish best served cold. Or not at all: not-nice people tend to be miserable enough that it constitutes an acausal punishment all by itself. And in any case, the game theory math tends to show that cooperating in iterated games is usually a good default.
“Sounds like a lot of work to think through that every time?”
Not really. It’s not like I have to think through all that in every situation. I just feel good being nice. But sometimes I reflect on what happened and realize that niceness wasn’t a good policy there. Then I can decide that the feeling wasn’t a reliable guide, and figure out how to nudge myself away from it the next time such a situation comes up.
“You’re pretty detached from your feelings, huh?”
I do have a rather mechanistic perception of humans, especially myself.
“Why is that?”
What I was doing previously did not work. This works better.
“Isn’t it a bit sad and cynical to have to go through that kind of thinking?”
No! It’s extremely beautiful how the same niceness that comes to some people by instinct can also be derived from game theory. How even someone who doesn’t internally care one bit about how you feel, beyond the instrumental benefits of it, can still be nice to you, not to mislead, but to trade. Sure, the wholesome appreciation is now oriented a bit more toward the dynamics than toward the agents. But I don’t see why it would be sad.
“We really have quite different kinds of minds, don’t we?”
Apparently.
This seems to sketch a mind design where the locus of terminal values is in emotions, so that non-emotional justifications are naturally instrumental. But terminal justifications/values can also be non-emotional, even if there’s some overlap and path-dependence: emotional causes for how the non-emotional terminal values came to be.
I’m mostly just trying to point out that your first ethical impressions of something are not always the ones you’d reflectively choose to keep. I’m also trying to explain how I do moral reflection. Something almost like the discussion above occurred to me recently, and the other person seemed to hold their view strongly.
If you discovered that there is some better / more correct formulation of game theory applicable to your situation, one that recommends backstabbing everyone in such-and-such specific ways, dealing great harm for a small benefit to you, would you switch to acting on it?
I don’t see how this is relevant. In the real world, all games are iterated games, and doing things like that would hurt your reputation gravely. Also, like, of course I would; I’d be a monster not to.
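The claim that cooperation is usually a good default in iterated games can be sketched with a toy simulation. (The payoff values and strategies below are my own illustrative choices, standard prisoner's-dilemma numbers, not anything from the dialogue; tit-for-tat stands in for a nice-but-not-exploitable policy.)

```python
# Toy iterated prisoner's dilemma. Payoffs for (my move, your move):
# both cooperate -> 3 each; both defect -> 1 each;
# defecting against a cooperator -> 5 (the cooperator gets 0).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_moves):
    # Nice but not exploitable: cooperate first, then mirror the last move.
    return opponent_moves[-1] if opponent_moves else "C"

def always_defect(opponent_moves):
    return "D"

def play(strat_a, strat_b, rounds=100):
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(moves_b)  # each strategy sees only the opponent's history
        b = strat_b(moves_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (300, 300): mutual cooperation
print(play(always_defect, tit_for_tat))    # (104, 99): defection barely wins head-to-head...
print(play(always_defect, always_defect))  # (100, 100): ...but a world of defectors does far worse
```

Defection edges out tit-for-tat in a single pairing, yet mutual cooperators score three times what mutual defectors do, which is the sense in which niceness falls out of the math once games repeat.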