As I write this, it occurs to me that loyalty interacts with prestige in a way that illuminates both phenomena.
Within an existing team (including friendship “teams” of size two), your prestige reflects your value to your teammates. More precisely, it’s the expected value of all your future contributions to the team — your NPV, if you will.
Now, in light of this, how might you go about increasing your prestige?
One strategy is to improve yourself — by learning new skills, for example — so that you’ll be able to accomplish things of greater value. You thereby increase the expected size of your future contributions to the team.
But there’s another, complementary strategy: try to increase the expected number of your future contributions. How? By convincing your teammates that you’re likely to stick around longer.
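The NPV framing above can be made concrete with a toy model. This is my sketch, not anything from the post: the function, the yearly-contribution figures, and the discount rate are all invented for illustration. It just shows that both strategies — bigger contributions, or more of them — raise the same discounted sum.

```python
# Toy model (illustrative only): prestige as the NPV of expected
# future contributions to the team.

def prestige_npv(contribution_per_year, years_expected, discount_rate=0.1):
    """Discounted sum of expected yearly contributions (made-up units)."""
    return sum(
        contribution_per_year / (1 + discount_rate) ** t
        for t in range(1, years_expected + 1)
    )

base = prestige_npv(10, 5)     # baseline teammate
skilled = prestige_npv(15, 5)  # strategy 1: bigger contributions
loyal = prestige_npv(10, 10)   # strategy 2: longer expected tenure

# Either lever raises prestige relative to the baseline.
assert skilled > base and loyal > base
```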
The transaction is simple. In return for your loyalty, you earn your teammates’ trust. And the interesting thing about trust — arguably, the essential thing — is that it isn’t portable. When you leave a team, you can take everything else with you (skills, knowledge, money), but all that accumulated trust stays behind.
In other words, prestige has two components: a general-purpose “global” component and a team-specific “local” component. Global prestige includes everything valued on the outside market: skills, knowledge, money, relationships with outsiders. Local prestige includes everything not transferable to the outside market — trust, relationships with teammates, and team-specific knowledge and skills.
I think this helps explain why loyalty-signaling practices are so powerful and produce such dramatic effects. Watch how loyalty-signaling can quickly escalate out of control:
1. We begin with an initiate who wishes to raise his value by demonstrating loyalty to his group. Words aren’t enough: he needs to make honest (costly) commitments, e.g., by doing things that make it harder for him to leave the group. Techniques here include severing ties with outsiders, doubling down on relationships with insiders, and undertaking lifestyle changes (diet, clothes, living arrangements) that make it harder to interact with the outside world.
2. In return for these demonstrations of loyalty, the initiate is rewarded with trust, i.e., local prestige — something that increases his value within the group, but which has no benefit to him if he decides to leave. In other words, his reward for binding himself to the group is… something that further binds him to the group.
3. Unfortunately, local prestige, like global prestige, is a zero-sum game. So in order to compete for it, team members need to outdo each other, e.g., with ever more extreme loyalty displays. This kind of competition is similar to what we find in any other prestige tournament (art, music, sports, academia, etc.). The main difference lies in the direction in which competitors sell themselves for prestige: artists and athletes sell themselves outward, while loyalty-signaling teammates sell inward.
4. Finally, if processes 1–3 are strong enough, most group members will end up fairly committed to the group. They’ll draw their admiration largely from other group members, and they’ll experience a large drop in status if they ever try to leave. All this, in turn, gives everyone a strong incentive to make sure everyone else remains loyal to the group. The peer pressure that results is likely to be intense.
Religions often take these processes to an extreme. Adherents scramble to signal commitment as a way of jockeying for local prestige. In this context, everyone is anxious to do and say the right things. Schelling points for “proper” beliefs and behaviors are established quickly, resulting in capricious orthodoxies and bizarre ritual practices. And because loyalty is what’s at stake, the group will tend to prefer beliefs and behaviors that are costly to maintain and perform. Orthodox Jews spurn food from non-Kosher kitchens. Fundamentalist Christians deny evolution. Christian Scientists refuse blood transfusions. Mormons wear special underwear. And every religion asks for weekly devotion. In each of these “transactions,” adherents sacrifice their status with outsiders as part of a calculated gambit to earn greater status among their co-religionists.
And when the conditions are just right — when the incentives to signal loyalty are strong enough, and the countervailing incentives weak enough — a community can undergo something like gravitational collapse. These groups then become perfectly insular communities, social black holes from which escape is all but impossible.
I was recently re-reading Social Status II: Cults and Loyalty, for unrelated reasons.
This all felt pretty relevant.
I was actually re-reading the post mostly in the context of personal relationships and small groups of friends, wherein I think trust and loyalty are actually good qualities (and possibly among the core things I value). I don’t currently have a principled distinction between when loyalty and loyalty-signaling are healthy vs. unhealthy, but in general it makes sense that ‘things that made sense in small groups or village-sized entities become unhealthy when industrially scaled up’.
Trust and loyalty seem to me to clearly be virtues if placed wisely and in moderation. Like all virtues, if you go too far, bad things happen. Industrial scaling gives you new ways to backfire, but there are certainly very non-industrial ways to go way overboard on either or both. Cults can be very small.