I think you misunderstand my concern; perhaps I have not been clear enough. I am not so much worried about equivocation as about precisely the three-step process you describe. And I am particularly worried about people going through that process, labelling something a “conspiracy theory,” having the theory turn out to be true, and never reassessing their premises.
Let’s restate your process in more neutral language.
1. For reasons of specialisation, partial information, etc., I treat the fact that lots of people believe a theory as partial evidence in its favour.
2. Some people have a higher prior than I do for the existence of conspiracies.
3. Therefore, if a theory involving a conspiracy is believed by quite a lot of people, it may be that this belief is due to their higher prior for conspiracies, not to any special knowledge or expertise that I need to defer to.
4. Therefore, I treat their belief in the theory as weaker evidence than normal, on the basis that if I had the evidence/expertise/etc. that they do, I would be less likely than them to conclude that there is a conspiracy.
5. So if someone offers me an implausible-seeming theory involving a conspiracy, I discount it and conclude that its advocates simply have a high prior for conspiracies.
Suddenly, this doesn’t look like a sound epistemological process at all. Steps (1) and (2) are fine, but (3), (4) and (5) go increasingly off the rails. It looks like you are deliberately shielding your anti-conspiracies prior, by discounting (even beyond their initial level of plausibility) theories that might challenge it. And if, on those occasions that a conspiracy is eventually proven, you refuse to update your prior on the likelihood of conspiracies (by insisting that such-and-such a theory doesn’t really count as a conspiracy theory, even though, at the time, you were happy to label it as such), then I would say that the process has become truly pathological, just as much as that of a “conspiracy theorist.”
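The discounting in steps (3) and (4) can at least be made precise. In Bayesian terms, someone's belief in a theory is evidence for it in proportion to the likelihood ratio P(they believe it | true) / P(they believe it | false), and a believer with a very high prior for conspiracies has a ratio close to 1, because they would believe almost regardless. A toy sketch, with all numbers purely hypothetical:

```python
def posterior(my_prior, p_believe_if_true, p_believe_if_false):
    """My updated credence in a theory, given that someone believes it,
    via Bayes' rule on the event 'they believe it'."""
    num = my_prior * p_believe_if_true
    den = num + (1 - my_prior) * p_believe_if_false
    return num / den

my_prior = 0.05  # my initial credence in the theory (hypothetical)

# An expert with a moderate prior believes only when the evidence is good:
# 80% believe-rate if the theory is true, 10% if it is false.
expert = posterior(my_prior, 0.8, 0.1)

# Someone with a very high prior for conspiracies believes almost regardless:
# 95% if true, 70% if false -- a much weaker likelihood ratio.
enthusiast = posterior(my_prior, 0.95, 0.7)

print(round(expert, 3))      # 0.296: the expert's belief moves me a lot
print(round(enthusiast, 3))  # 0.067: the enthusiast's belief moves me little
```

On this sketch, step (4) is unobjectionable as far as it goes; the dispute is whether the high prior really is uninformative, which is the question taken up below.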
Consider: why do some people have that higher prior? Mightn’t that higher prior itself be part of their tacit knowledge and expertise, in the same way that a doctor’s prior on the cause of a set of symptoms reflects not a ‘liking’ for diagnosing tuberculosis but his own updates on past experience to which you are not privy? Aren’t we doing precisely the wrong thing by discounting the theory in response to the prior?
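The doctor analogy can be made concrete: in a Bayesian model, today's prior is just the posterior of yesterday's evidence. A toy beta-binomial sketch, with all counts hypothetical:

```python
# A prior can itself be the residue of evidence. A doctor who starts from
# a uniform Beta(1, 1) prior over the TB rate and then sees 30 TB cases in
# 100 past patients ends up holding what looks like a "high prior" for TB.

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

a, b = 1, 1                          # uniform prior: no opinion yet
past_tb, past_other = 30, 70         # hypothetical past caseload
a, b = a + past_tb, b + past_other   # conjugate Bayesian update on experience

doctor_prior = beta_mean(a, b)
print(round(doctor_prior, 3))        # 0.304: an elevated prior for TB
```

Discounting the doctor's diagnoses *because* his prior is elevated would amount to discarding the very data the prior encodes.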
None of this means that you should become a 9/11 Truther, of course. But consider the 1999 Russian apartment bombings. I don’t have any particular evidence about the events, but there’s a plausible case that they were an FSB conspiracy. Shouldn’t that make you more willing to believe the hypothetical in the grandparent than otherwise? In my own experience, the people most likely to believe in conspiracy theories are those who had their formative experiences in dictatorial countries where there really are lots of conspiracies—and so they subsequently see them everywhere. But by symmetry, it follows that those of us brought up in the West will be too reluctant to see conspiracies elsewhere.
(I’m not sure that #2 is the right formulation. A lot of people don’t think in terms sufficiently close to Bayesian inference that talking about their “priors” really makes sense. I’m not sure this is more than nit-picking, though.)
I agree that #3, #4, and #5 “go increasingly off the rails”, but I think what goes off the rails is your description, as much as the actual mental process it aims to describe. Specifically, I think you are making the following claims and blaming them on the term “conspiracy theory”:
1. That when someone thinks something is a “conspiracy theory” they discount it not only in the sense of thinking it less likely than they otherwise would have, but in the stronger sense of dismissing it completely.
2. That they are then immune to further evidence that might (if they were rational) lead them to accept the theory after all.
3. That if the theory eventually turns out to have been right, they don’t update their estimate of how much to discount theories for being suspiciously conspiracy-based.
Now, I dare say many people do do just those things. After all, many people do all kinds of highly irrational things. But unless I’m badly misreading you, you are claiming specifically that Brillyant and I do them, and you are laying much of the blame for this on the usage of the term “conspiracy theory”, and I think both parts of this are wrong.
“Mightn’t that higher prior be itself part of their tacit knowledge and expertise”
Yup. But the answer to that question is always yes, and therefore tells us nothing. (Mightn’t a creationist’s higher prior on the universe being only 6000 years old be part of their tacit knowledge and expertise? It might be, but I wouldn’t bet on it.)
“But by symmetry, it follows that those of us brought up in the West will be too reluctant to see conspiracies elsewhere.”
I don’t think the symmetry is quite there. People brought up in totalitarian countries who then move to liberal democracies see too many conspiracies. No doubt people brought up in liberal democracies who then move to totalitarian countries see too few, but it could still be that people brought up in totalitarian countries who stay there and people brought up in liberal democracies who stay there both see approximately the right number of conspiracies.