Social incentives affecting beliefs

Having weird ideas relative to your friends and associates means paying social costs. If you share your weird ideas, you’ll have more arguments, your associates will see you as weird, and you’ll experience some degree of rejection and decreased status. If you keep your weird ideas to yourself, you’ll have to lead a double life: privately constructed knowledge on the one hand and a public facade on the other.

For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the American South, with minimal contact with the outside world. (I’ve heard from reliable sources that the stereotypes about the South are accurate.) Not many of us would do this voluntarily.

The weirder your beliefs get relative to your peer group, the greater the social costs you’ll have to pay. Imagine plotting the beliefs of your associates as points in a multidimensional space and placing a hook at their center of mass. Picture yourself attached to this hook with an elastic band. The farther you stray from the center of mass, the stronger the force pulling you back towards conventional beliefs.
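
If it helps to make this concrete, here’s a toy sketch of the elastic-band picture. The linear “spring” pull and the numbers are purely illustrative assumptions, not a claim about how the underlying psychology actually works:

```python
import numpy as np

# Toy model: beliefs as points in a multidimensional space, with a
# restoring pull toward the peer group's center of mass. The linear
# spring constant k is an illustrative assumption.
def conformity_pull(your_beliefs, peer_beliefs, k=0.1):
    """Force vector dragging you back toward the peer-group average."""
    center_of_mass = peer_beliefs.mean(axis=0)    # the "hook"
    displacement = center_of_mass - your_beliefs  # how far you've strayed
    return k * displacement                       # farther out, stronger pull

# Example: three like-minded peers, two belief dimensions, and you far away.
peers = np.array([[0.2, 0.9], [0.3, 0.8], [0.25, 0.85]])
you = np.array([0.9, 0.1])
print(conformity_pull(you, peers))  # a sizable pull back toward conventional beliefs
```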

This theorizing has a few straightforward implications:

  • If you notice yourself paying a high social or psychological cost for your current set of beliefs, and you have reasons not to abandon them (e.g. you think they’re correct), consider finding a new set of associates among whom those psychosocial costs are lower (people who agree with you more, people who are less judgmental, or some combination) so you can stop paying them. If you can’t find any such associates, create some: convince a close friend or two of your beliefs, so you have a new center of mass to anchor yourself to. Also cultivate psychological health through better relationships, meditation, self-love and acceptance, and so on.

  • If you’re trying to help a group have accurate beliefs on aggregate, stay nonjudgmental, so that the forces pulling people towards conventional wisdom are weaker and they’re influenced more by the evidence they encounter than by the incentives they face. You may say “well, I’m only judgmental towards people’s beliefs when they’re incorrect.” But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I’m deciding whether to officially adopt some belief as part of my thinking, I’ll weigh the expected social cost of holding it: the probability that it’s incorrect times the penalty I’d pay if it is. So even punishing only the incorrect beliefs will still lower the rate at which people adopt unusual beliefs (sketched below).
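
As a toy illustration of the expected-cost point above (all the numbers are made up; the takeaway is just that a penalty applied even only to incorrect beliefs still makes every unusual belief costlier in expectation, before anyone knows whether it’s correct):

```python
# Toy expected-social-cost calculation for adopting a belief.
# The probabilities and penalties here are invented for illustration.
def expected_social_cost(p_incorrect, penalty_if_incorrect, penalty_if_correct=0.0):
    return p_incorrect * penalty_if_incorrect + (1 - p_incorrect) * penalty_if_correct

# A weird belief I'm 70% confident in, in a group that punishes only incorrect beliefs:
print(expected_social_cost(p_incorrect=0.3, penalty_if_incorrect=10.0))  # 3.0, still > 0
```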

Some more bizarre ideas:

  • Deliberately habituate yourself to, and adapt to, the social costs associated with having weird ideas. Practice coming across as “eccentric” rather than “kooky” when explaining your weird ideas, and state them confidently, as if they’re natural and obviously true, to reduce the status loss. Consider adopting a posture of aloofness or mystery. Or, for a completely different approach, deliberately adopt a few beliefs that you suspect are true but your social group rejects, and keep them secret, to practice having your own model of the world independent of your social group’s.

  • If you notice that social costs are either keeping you from adopting a weird idea of yours, or charging you ongoing “rent” to maintain it, do a cost-benefit analysis and deliberately either keep the belief and pay the upkeep or discard it from your everyday mental life (preferably making a note at the time you discard it). You have to pick your battles.

  • Start being kinda proud of the weird things you think you’ve figured out, in order to cancel out the psychosocial punishment for weird ideas with a dose of psychosocial reward. Keep the pride to yourself, to avoid humiliation if your beliefs turn out to be wrong. The point is to be guided by the evidence you have, even if that evidence is biased or incomplete, rather than solely by the opinion of the herd. (Of course, the herd’s opinion should itself be counted as evidence. But if you’re doing it right, you’ll err on the side of agreeing with the herd too much and agreeing with it too little about equally often… unfortunately, agreeing with the herd too little and being wrong generally hurts you much more than agreeing with the herd too much and being wrong.)