@1a3orn goes deeper into another dynamic that leads groups to hold false beliefs while believing they are true: some bullshit beliefs help you figure out who to exclude, namely the people who don’t currently hold the belief. Relatedly, assholery helps people who don’t want their claims checked, which is a reason I think politeness is actually useful in practice for rationality:
(Sharmake’s first tweet): I wrote something on a general version of this selection effect, and why it’s so hard to evaluate surprising/extreme claims relative to your beliefs, and it’s even harder if we expect heavy-tailed performance, as happens in our universe.
(1a3orn’s claims) This is good.
I think another important aspect of the multi-stage dynamic here is that it predicts that movements with *worse* early stages end up with fewer contrary arguments at later stages...
...and in this respect it is like an advance-fee scam, where deliberately non-credible aspects of the story help filter people early on, so that only people apt to buy in reach the later parts.
So it might be adaptive (survival-wise) for a memeplex to have some bullshit beliefs, because their filtering effect means there will be fewer refutations of the rest of the beliefs.
It can also be adaptive (survival-wise) for a leader of some belief system to be abrasive, an asshole, etc., because fewer people will bother reading them ⇒ “wow, look how no one can refute my arguments.”
(Sharmake’s response) I didn’t cover the case where the belief structure is set up as a scam; instead I focused on how, even assuming LWers are trying to get at truth and aren’t adversarial, the very existence of this effect combined with heavy tails makes claims hard to evaluate.
But good points anyway.
(1a3orn’s final point)
Yeah, to be clear, I think that if you just blindly run natural selection over belief systems, you get belief systems shaped like this regardless of the intentions of the people inside them.
It’s just an effective structure.
Quotes from this tweet thread.