I think the way around that issue is to bite the bullet—those things belong in a proper theory of mind. Most people want to be conformist (or at least to maintain a pleasant-to-them self-image) more than they want to be rich. That seems like a truth (lowercase t—it’s culture-sensitive, not necessarily universal) that should be modeled more than a trap to be avoided.
But people still leave a lot of efficient, low-effort conformity on the table—a superintelligent conformist human could be far better at being (or appearing) conformist than we can ever manage.
So a model that assumes people are superintelligent conformists would be badly wrong.