Unfortunate Information

I’m growing increasingly convinced that the unfortunate correlations between types of people and types of arguments lead to persistent biases in uncovering actual knowledge.

As an example, MR wrote this article (which they just linked to again today) on Ben Carson in 2015*. Cowen’s argument is that, while perhaps implausible (though it may have tenuous support), the idea that Carson believes the pyramids were used for grain storage isn’t any more unrealistic than plenty of other religious beliefs. If anything, that singular belief is relatively realistic compared to more widely accepted miracles in Christianity or similar religions.

So why does he get so much flak for it? Cowen argues that he shouldn’t, and that the criticism is unfounded and inconsistent. Is it? He obviously has a fair point. The trouble is that even though the belief, analyzed on its own, isn’t particularly ridiculous, we all share an expectation that people who hold this type of belief (let’s call them Class B religious beliefs) ARE particularly ridiculous.

This then creates a new equilibrium, where only those people who take their Class B religious beliefs *very* seriously will share them. As a result, when Carson says the pyramids stored grain, our impulse is “wacky!” But when Obama implies he believes Jesus rose from the dead, our impulse is “boilerplate; he probably doesn’t give it much thought, and it’s such a typical belief that he might not even believe it.”

As a result we get a constant mismatch between the type of person who holds a belief and the truth value of the belief itself. I don’t mean to bring up only controversial examples, but it’s no surprise that this is where these examples thrive. HBD is another common one. While there is something there, which after a fair amount of reading I suspect is overlooked, the type of person who is really passionate about HBD is (more often than not, with exceptions) not the type of person you want over for dinner.

This can suck for people like us. On one hand, we want to evaluate individual pieces of information, models, or arguments based on how they map to reality. On the other hand, if we advocate or argue for information that is correlated with an unsavory type of person, we get classified as that type of person. In this sense, for someone whose primary objectives are good social standing and no risk to their career, it would be irrational to publicly blog about controversial topics. It’s funny: Scott Alexander was retweeted by Ann Coulter for his SSC post on Trump. He was thrilled, but imagine if he were an aspiring professor? I think he would probably still be fine, because his unique level of genius would still shine through, but lately professors I know who have non-mainstream political views have stopped sharing them publicly for fear of controversy.

This is a topic I think about a lot, and one I now notice becoming a bigger issue in the US. And I wonder how, exactly, to respond. The tension between what it is rational to believe and what it is rational to share publicly keeps growing.

*(http://marginalrevolution.com/marginalrevolution/2015/11/bully-for-ben-carson.html)