I fell into conspiratorial thinking when I was less experienced in the ways of the world. The start of finding my way out was noticing that a dark OD green helicopter with subdued markings, viewed against a bright sky by an untrained observer, could easily be mistaken for an unmarked black helicopter. I remember that some of the claims I recognized immediately (everyone has seen black SWAT uniforms, for example), and this created a sense of credibility; others I merely suspected were true (I was anti-internationalist back then); and still others I had no direct knowledge of, so rather than evaluate them I gave the theory a pass on those. It didn’t seem like an aberration at the time, but I didn’t know about the conjunction fallacy back then. The theory certainly reinforced the beliefs I held then, and it also gave me a feeling of superiority because I knew a secret. Yup, I was a sucker.
I’ve come to notice that conspiracy theories tend to be built on two or more things that are known or believed to be true but that are not normally expected to be linked. These are then joined by a pretext for linking them in a way that would seem outrageous to anyone who doesn’t hold the theory. Some of these details are often inconsequential in themselves and add nothing of substance to the theory, but a few high-probability elements skew the conjunction fallacy in the theory’s favor. Sometimes major parts of a theory are not known at all, and classic propaganda techniques like the “big lie” are used to make us evaluate their probability more highly than we should, though this is not strictly necessary given the power of the conjunction fallacy. Today I take it as a red flag when I see a linkage being formed between two things that are each obvious but where the linkage itself is tenuous. If you have to ask “where are you going with this?”, the answer probably isn’t good.
This linkage is then used to attach the theory to multiple further things that are not known but are either suspected by the individual or group, or inspired by the theory itself, and that can be connected to the conspiracy by conjecture. Once the basic linkage is accepted, the conjunction fallacy keeps rolling, and the high probability erroneously given to the initial theory gets associated with all of the other craziness.
A fake conspiracy theory can demonstrate how the conjunction fallacy plays into this. “We know that we orbit the sun, and we know the sun causes skin cancer, but did you know that the earth, sun, and human race were created by ancient aliens as part of a skin cancer experiment?” Our intuition gives 100% to the first two claims, and since we can’t know the third we mentally skip it, which is like giving it 0% probability. But that’s okay, because 100% + 100% + 0% = 200% probability, so we are believers now. Once we are believers, we give the entire idea 100% likelihood, which means every component must be true, so each gets 100% likelihood in our minds, and now we know that we are alien skin cancer test subjects. But if that is true, then sunscreen was probably invented by the aliens as part of their testing protocol; otherwise, why would they let us use something that reduces our chance of getting skin cancer? So now we know that sunscreen is a perverse part of the alien plot, and we refuse to buy it and try to convince our friends that it is evil.
Obviously, we should instead be multiplying probabilities: 100% × 100% × 0% = 0%. But that isn’t the way most people tend to think. Instead, most people rely on whether one of the foundational claims of the theory contradicts or supports their biases, so most people who don’t follow conspiracy theories are probably not doing so because they made the correct probability calculation, but because they “know” one of these things can’t be true. For example, a geocentrist would reject the theory since they “know” that the earth doesn’t orbit the sun, while a creationist would reject it outright because they know God created the earth, not aliens. The geocentrist might start their own branch of the theory, though, like the many branches of the JFK assassination theory. I suspect that one reason the Moon landing conspiracy has so few adherents is that it violates so many ideological positions, even ones that tend to be mutually exclusive demographically: the American right is likely to reject it because it is anti-American, while the American left is likely to reject it because it is anti-science. If everything about the moon landing were identical but the Soviets had gotten there first, I suspect the American right would embrace it at rates similar to young-earth creationism.
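The two calculations above can be sketched in a few lines of Python. The probabilities assigned here are illustrative assumptions for the fake theory, not real estimates:

```python
# Illustrative probabilities for the three claims in the fake theory.
p_orbit = 1.0   # "we orbit the sun" -- known true
p_cancer = 1.0  # "the sun causes skin cancer" -- known true
p_aliens = 0.0  # "ancient aliens created it all" -- no evidence, call it ~0

# The intuition described above effectively adds the probabilities up
# (and caps the total at certainty), so two certain premises make the
# whole conjunction feel certain.
felt = min(p_orbit + p_cancer + p_aliens, 1.0)

# The correct rule: the probability of a conjunction is the product of
# its components (assuming independence), so it can never exceed its
# least probable component.
actual = p_orbit * p_cancer * p_aliens

print(felt)    # 1.0 -- "we are believers now"
print(actual)  # 0.0 -- the conjunction is no better than its weakest link
```

The point of the contrast is that adding even one near-zero claim to a chain of certainties should collapse the whole theory, but the additive intuition lets the certainties carry it instead.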
Hello, Less Wrong:
I have been lurking around LW for a while after finding it through links on MIRI or FHI. I’ve only recently begun to learn about Bayesian probability and inference on a practical level. I’m going through school for a bachelor’s in game programming. For now my primary focus is on the simplified AI currently used in gaming, but I believe that more sophisticated AI technologies, like natural language parsing and more realistic behavioral simulation and problem solving, will be useful in games in the near future. I work as a help desk tech, where I get to experience the contrast between human irrationality and technological rationality on a daily basis.
I tend to be a devil’s advocate by nature, though I do not identify as a contrarian. I’ve learned to recognize assumptions and try to spot them in myself as well as in others, and I frequently re-evaluate and change longstanding intellectual and even political beliefs. I find that there must be a balance when advocating unpopular positions, though: if one nitpicks the small stuff, by the time something important comes up one has already alienated everyone.
I grew up in the woods without electricity back in the 80s, but read everything I could get my hands on. This included many of the books my parents owned and everything that interested me at the local library. I think I learned to be a rationalist by listening to my dad’s rants. For example, he supposed himself to be a free-market conservative on one hand, but then he would get poor service from a company, get angry, and yell, “There ought to be a law!” Such things would make me shake my head and pledge never to be like that. To their credit, though, my parents did encourage free thinking and exploring divergent ideas; I was encouraged to read the Communist Manifesto, for example. I keep meaning to read Das Kapital, because references to it that I’ve encountered make me suspect that it was written more for decision-makers, while the Manifesto seems more of a political handbook for the masses.
I feel that LW helps to reinforce my good habits and remind me to check my bad habits. I look forward to learning to more consistently practice these habits, and learn more about using Bayesian logic in life and my career.