I have several “local” nitpicks and a “global” objection to the overall narrative that I think is being proposed here. Local issues first.
Alexander concludes with a characteristically mistake-theoretic plea for mutual understanding
He does. But one thing you don’t mention, which I think makes your presentation of SA’s argument misleading, is this: between the things you’ve previously described (about Fox News, liberal academics, and the Bush/Quayle “revenue enhancements”, all things going on in the contemporary-ish USA) and the bit you go on to quote (referring to “a hostile environment that lies to them all the time”), SA’s piece has moved on somewhat, and (1) the “hostile environment” bit is not talking about the contemporary US but about (a somewhat hypothetical version of) the Stalin-era USSR and (2) the business about interactions between “savvy” and “clueless” people is addressing a fundamentally different question from most of the article.
So (1) to whatever extent you’re taking SA’s article to say that the contemporary USA (or other similar places) is “a hostile environment that lies to us all the time”, I think that is an error (maybe SA would in fact agree with you about that, maybe not, but at any rate it isn’t what he says).
And (2) he isn’t saying we should necessarily regard the presenters on Fox News, or those liberal academics, or the writers in the Washington Post, or the government of the Stalinist USSR, as being honestly mistaken. That isn’t what the conflict/mistake dichotomy is about. When he talks about our attitudes to those people, he does so in terms of “they aren’t honest, but are they likely to be lying to me about this, in this particular way, in this particular context?”. The mistake-theory-ish bit you quote comes in only at the end and is about an entirely different question: how should we interact with people whose assessment of the honesty of what those would-be authorities are saying is different from ours?
We need a conflict theory to understand this type of situation [sc. cop-bribery].
“Conflict theory” and “mistake theory” don’t mean thinking that everyone all the time is or isn’t working towards the same goal. Obviously different people have different goals, sometimes opposing ones. The terms only make sense in the context of some sort of discussion (e.g., a political one) where the differences between you and your interlocutor may or may not be conflict-y or mistake-y. The bribing-a-cop scenario is not of this type, and “we need a conflict theory to understand this type of situation” seems to me like a category mistake.
(Remark: I think conflict/mistake oversimplifies in important ways. 1. We can have the same ultimate goals but still relate in conflict-y ways, if our differing opinions give us opposing instrumental goals and prospects for reaching agreement on those opinions are poor. 2. There are ways to have different goals that aren’t of the form “I want my group to be on top, you want your group to be on top”, and while these may still lead to conflict, I think it’s a fundamentally less-hostile sort of conflict.)
When a government denies tax increases but announces “revenue enhancements” [...] regime supporters and dissidents are trying to do different things. Dissidents want to create common knowledge of the regime’s shortcomings [...]
There are definitely situations where “dissidents are trying to create common knowledge of the regime’s shortcomings” so that when the right time comes everyone can have enough confidence to revolt. But SA’s example of “revenue enhancements” is unambiguously not one of those situations. One didn’t need any particular degree of common knowledge to not vote for George Bush. No one was proposing an armed revolt or anything similarly risky. Saying “aha, Bush did levy new taxes despite saying he wouldn’t” did not put one in danger of being “crushed as an individual”.
(This is a place where I think you are taking advantage of your earlier conflation of contemporary-US and Stalinist-USSR situations in SA’s article.)
Further, while the “revenue enhancements” thing is obviously slimy, it’s not remotely in the same category as e.g. the things in the “Kolmogorov complicity” article you link to. Saying that thunder is heard before the corresponding lightning is seen (SA’s example in that article) is flatly incompatible with reality; you can’t actually believe it along with the truth about how physics and thunderstorms work, but you can call a tax a “revenue enhancement” without any actual false beliefs about reality. (You probably can’t think that’s optimal terminology for good thinking without false beliefs, but most people most of the time are not choosing their terminology solely to optimize good thinking, and it’s not at all clear that they should.)
As for the overall narrative:
The impression I get from your article is something along the following lines: “SA is a mistake-theorist; he wants us to think of other people as basically on the same side as us, and reason politely with them. His article about bounded distrust applies this thinking to governments, major media sources, etc. But this is all wrong and possibly outright sinister: governments, major media sources, etc., are actively trying to mislead us for their own ends, and the people who want to think in mistake-theory terms in such a situation are the lackeys of Power, the government mouthpieces and suchlike, as opposed to the brave dissidents who see the conflict for what it is.” With a somewhat-plausibly-deniable side order of “Boooo to SA, who has shown himself to be on the side of Power, which is much like the government of the Stalinist USSR”.
And I think most of this narrative is wrong. SA is indeed a mistake-theorist, but he conspicuously doesn’t take that to mean that the mouthpieces of state/cultural/… power should be assumed to be arguing in good faith. His article about bounded distrust, in particular, doesn’t suggest doing that. I see no reason to think that his general preference for mistake theory indicates that he is on the side of Power (whatever specific sort of Power that might be). I do not think any sort of Power he is plausibly on the side of has much in common with the Stalinist USSR.
SA’s piece has moved on somewhat, and (1) the “hostile environment” bit is not talking about the contemporary US but about (a somewhat hypothetical version of) the Stalin-era USSR
It doesn’t seem to me like the setting of the illustrative examples should matter, though? The problem of bounded distrust should be qualitatively the same whether your local authorities lie a lot or only a little. Any claims I advance about human rationality in Berkeley 2023 should also hold in Stalingrad 1933, or African Savanna −20,003, or Dyson Sphere Whole-Brain Emulation Nature Preserve 2133.
about an entirely different question: how should we interact with people whose assessment of the honesty of what those would-be authorities are saying is different from ours?
I think they’re related! The general situation is: agent A broadcasts claim K, either because K is true and A wants Society to benefit from knowing this, or because A benefits from Society believing K. Agents B and C have bounded distrust towards A, and are deciding whether they should believe K. B says that K doesn’t seem like the sort of thing A would lie about. From C’s perspective, this could be because it really is true that K isn’t the sort of thing that A would lie about—or it could be that A and B are in cahoots.
Section IV. of “Bounded Distrust” opens with the case where A = “credentialed experts”, K = “ivermectin doesn’t work for COVID”, B = “Scott Alexander”, and C = “Alexandros Marinos”. But the problem should be the same if A = “Chief Ugg”, K = “there’s a lion across the river”, or A = “the Dyson Sphere Whole-Brain Emulation Nature Preserve Tourism Board”, K = “Norton AntiVirus works for cyber-shingles”, &c.
The general problem is that agents with different interests sometimes have an incentive to distort shared maps, so it’s very naïve to say “it’s important for these two types of people to understand each other” as if differences in who one trusts were solely due to differences in map-correction skill (mistake theory), rather than differences in who one trusts to not distort shared maps to one’s own detriment (conflict theory).
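To make that concrete, here is a minimal Bayesian sketch of C’s situation (the specific numbers and the likelihood model are my own illustrative assumptions, not anything from “Bounded Distrust”): C enumerates joint hypotheses about whether K is true, whether A is honest on this topic, and whether B is independent of A, and computes the posterior on K after observing both A’s broadcast and B’s endorsement.

```python
from itertools import product

def posterior_k(prior_k=0.5, prior_a_honest=0.5, prior_b_independent=0.5,
                b_skill_true=0.9, b_skill_false=0.2):
    """P(K | A broadcast K and B endorsed K), by enumerating joint hypotheses."""
    total = 0.0
    mass_k_true = 0.0
    for k, a_honest, b_indep in product([True, False], repeat=3):
        # prior over the joint hypothesis (K true?, A honest?, B independent?)
        p = (prior_k if k else 1 - prior_k)
        p *= (prior_a_honest if a_honest else 1 - prior_a_honest)
        p *= (prior_b_independent if b_indep else 1 - prior_b_independent)
        # likelihood of A broadcasting K: an honest A only broadcasts K if it's true;
        # a self-interested A broadcasts K either way
        p_broadcast = 1.0 if (k or not a_honest) else 0.0
        # likelihood of B endorsing K: an independent B endorses with some
        # discrimination skill; a B in cahoots with A endorses whatever A says
        p_endorse = (b_skill_true if k else b_skill_false) if b_indep else 1.0
        joint = p * p_broadcast * p_endorse
        total += joint
        if k:
            mass_k_true += joint
    return mass_k_true / total

# Same broadcast, same endorsement; only C's trust in B's independence differs.
print(posterior_k(prior_b_independent=0.9))  # C mostly trusts B's independence
print(posterior_k(prior_b_independent=0.1))  # C suspects B and A are in cahoots
```

The point of the toy model is just that, holding the evidence fixed, C’s credence in K depends substantially on C’s prior that B is independent rather than in cahoots with A, which is a difference in who one trusts, not a difference in anyone’s map-correction skill.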
(Thanks for commenting! You’re really challenging me to think about this more deeply. This post came about as a 20x wordcount expansion of a Tweet, but now that your criticism has forced me to generalize it, I’m a little worried that my presentation of the core rationality insight got “contaminated” by inessential details of my political differences with Scott; it seems like there should be a clearer explanation for my intuition that mistake theory corresponds with the “loyalist” rather than the “dissident” side of a conflict—something about how power can make contingent arrangements seem more “natural” than they really are?—and I’m not immediately sure how to make that crisp, which means my intuition might be wrong.)