The Belief Signaling Trilemma

One major issue for group rationality is dishonest belief signaling. Beliefs are used as signals to preserve your reputation or to show membership in a group. This happens subconsciously, and I believe it causes many of the problems with both religions and political parties. It also happens on Less Wrong, most commonly through not sharing beliefs that you expect people to disagree with.

First, let’s identify the problem. This is framed mostly in terms of ideal agents rather than actual humans. You are part of a community of rationalists. You discuss many issues, and you become familiar with the other members of your community. As a result, you start to learn which members of the community are the smartest. Of course, your measure of a member’s intelligence is biased towards people who say things you agree with. The members who say things you agree with build up more reputation in your mind, and that reputation makes you more likely to trust other things they say.

Now suppose I am also a member of this community. I have opinions on many things, but there is one issue that I think does not matter at all. On this issue, most of the community believes X, but I believe not-X. By signaling belief in X, I increase my reputation in the community, and other people will take my views more seriously on the issues I think actually matter. Therefore, I choose to signal belief in X.
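To make that incentive concrete, here is a minimal sketch with made-up numbers. Nothing here is something anyone actually computes; the constants and the payoff function are purely illustrative stand-ins for the reasoning above.

```python
# Toy model of the incentive described above (all numbers invented).
# I believe not-X, the community believes X, and I consider the X issue worthless.
# Reputation earned from my stated positions scales how seriously my other views are taken.

REP_GAIN_IF_AGREE = 1.0      # reputation gained by signaling the popular belief X
REP_LOSS_IF_DISAGREE = 1.0   # reputation lost by signaling not-X
VALUE_OF_X_ISSUE = 0.0       # how much I care about the community getting X right
VALUE_PER_REP_POINT = 0.5    # how much each reputation point helps my other, more important views

def expected_payoff(signal_x: bool) -> float:
    """My expected value from signaling belief in X versus reporting not-X honestly."""
    reputation = REP_GAIN_IF_AGREE if signal_x else -REP_LOSS_IF_DISAGREE
    honesty_value = 0.0 if signal_x else VALUE_OF_X_ISSUE
    return reputation * VALUE_PER_REP_POINT + honesty_value

print(expected_payoff(True))   # 0.5  -> signaling belief in X
print(expected_payoff(False))  # -0.5 -> reporting not-X honestly
```

As long as I value the extra credibility more than I value the community's accuracy about X, signaling X wins.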

What is happening here is that:

(A) People assign reputation based on claims, and judge claims partly based on the other beliefs signaled by the same person.

(B) People want their claims to be taken seriously, and so take actions that preserve and improve their reputation.

Therefore,

(C) People signal beliefs that they believe are false, because those beliefs are shared by the community.

Signaling honest beliefs is a bit like cooperating in a prisoner's dilemma: it helps the community push towards what you believe are valid conclusions, at the cost of your own reputation. It is possible for us to decide as a community that we want to cooperate, especially with tools such as the anti-kibitzer. However, there is more than one way to do that. I see three options, all of which seem theoretically possible, and all of which seem bad.
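To spell out the analogy before going through the options, here is what the payoff structure might look like, with invented numbers: "cooperate" means reporting your belief honestly, and "defect" means signaling the community's belief.

```python
# Prisoner's-dilemma-style payoffs for two community members (numbers invented).
# Each entry is (my payoff, their payoff). Defecting is individually better no matter
# what the other does, but mutual honesty leaves the community better informed
# than mutual signaling.

payoffs = {
    ("cooperate", "cooperate"): (2, 2),  # both honest: the community learns the most
    ("cooperate", "defect"):    (0, 3),  # I pay the reputation cost alone
    ("defect",    "cooperate"): (3, 0),  # I free-ride on the other's honesty
    ("defect",    "defect"):    (1, 1),  # everyone signals: the community learns the least
}

for (mine, theirs), (me, them) in payoffs.items():
    print(f"{mine:>9} / {theirs:<9} -> me: {me}, them: {them}")
```

With that picture in mind, here are the options.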

(1) We can agree to stop assigning reputation based on beliefs.

This option is bad because it throws away information. People who made the right call on one issue are more likely to make the right call on other issues, and we would be ignoring that correlation.

(2) We can agree to always report our beliefs honestly, even though we know it will cost us reputation.

This option is bad because it encourages self-deception. If you have committed to reporting your beliefs honestly, and you could gain reputation by reporting belief in X, you may trick yourself into thinking that you actually believe X.

(3) We can allow dishonest reporting of beliefs to continue.

This option is bad because it introduces a bias: the community gets a stream of evidence that is skewed towards its current beliefs.

Which option do you prefer? Am I missing a fourth option? Is one of the choices obviously the best or obviously the worst? Should we combine them somehow? Am I modeling the problem entirely wrong?