OK, fine, maybe it’s signaling. I’m OK with that, since the part of me that really does care thinks “if my desire to signal leads me to help effectively, then it’s fine in my book.” But then I’m fascinated, because that part of me may actually be motivated by my desire to signal my kindness. It may be signaling “all the way down,” but it seems to be alternating levels: signaling motivated by altruism motivated by signaling. Maybe it eventually stabilizes at one or the other.
I don’t care. Whether I’m doing it out of altruism or for signaling (or, as I personally suspect, neither, but rather something more complex, involving my choice of personal identity, which I suspect uses the neural architecture that was developed for playing status games but has been generalized to compare against an abstract ideal instead of other agents), I do want to be maximally effective.
If I know what my goals are, what motivates them is not of great consequence.