I think it’s actually another control process—specifically, the process of controlling our identities. We have certain conceptions of ourselves (“I’m a good person,” “I’m successful,” “people love me”). We then constantly adjust our lives and actions to maintain those identities—e.g. by selecting the goals and plans most consistent with them, and by looking away from evidence that might falsify them.
Something about this feels weird to me… where do identities come from, then?
I think it’s accurate to say that people “choose their own self-fulfilling prophecies/identities”… but what makes some self-fulfilling prophecies preferable to others?
They fit better with previous identities.
Ultimately it grounds out in what historically received neurochemical rewards—though once you’ve built up enough identity, you can withstand a lot of negative reward in service of it; e.g. people can be tortured without letting go of their identities.
I suspect that part of identity is, or historically was, tied to loyalty to a collective of humans sharing broadly similar values. Another aspect is the epistemic problem that forces even AIs to choose a cultural hegemon and adhere to its views.
Edited to add: there is also the matter of other sunk costs—e.g. a mathematician may find it hard to switch fields because of the need to learn a new body of knowledge from scratch.