Paper is True


Postscript to: Rock is Strong

Always choosing Rock, broadly construed, means freerolling on the efforts of others and/or lying. You are, in some important sense, defecting. The problem of an information cascade is real. You are not contributing new true information; instead you are reinforcing the information and heuristics already out there.

Often this is the most accurate option available to you, or at least the most efficient. Doing the work yourself is hard. Many can’t do better that way, and even for those who can, it is a lot of work. On average, those who disagree are more wrong and end up doing worse. Other times, accuracy is not what is desired. None of that makes it easy to pick the correct rock, but often it is straightforward enough.

The defection that comes from not doing your part for the long-term epistemic commons is often made up for, not only by your lack of a practical option to do otherwise, but also in some cases by the gains from accuracy.

Rewarding those who worship the rock, like rewarding any form of defection, is also defecting. The Queen causes everyone to die in a volcanic eruption because she set up bad incentives, encouraging everyone to spend down the commons.

This can also be thought of partially as explore versus exploit. The majority of actions should be exploit, and most people should mostly be exploiting. There is more variance in ability to explore than exploit, so most people should explore even less than average and let others explore and report back. Exploration is a public good, so it needs to be rewarded beyond its private payoff or there won’t be enough of it. When competition is sufficiently intense or other factors reward exploitation too much, exploration can die out entirely.
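This is not in the original argument, but a toy simulation makes the explore/exploit point concrete. The sketch below is purely illustrative, not anyone’s actual model: a simple epsilon-greedy bandit in which the payoffs quietly shift partway through; the function name, parameters, and numbers (simulate, explore_rate, drift_at, and so on) are all made up for the example.

```python
import random

def simulate(explore_rate, rounds=2000, n_arms=5, drift_at=1000, seed=0):
    """Average reward of an epsilon-greedy agent in a world that shifts midway."""
    rng = random.Random(seed)
    true_means = [rng.random() for _ in range(n_arms)]  # hidden payoffs of each option
    estimates = [0.0] * n_arms                          # the agent's beliefs about them
    counts = [0] * n_arms
    total = 0.0
    for t in range(rounds):
        if t == drift_at:
            # the world changes: whatever used to be the best option quietly degrades
            best = max(range(n_arms), key=lambda a: true_means[a])
            true_means[best] = 0.1
        if rng.random() < explore_rate:
            arm = rng.randrange(n_arms)                            # explore: try something at random
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])   # exploit the current "rock"
        reward = true_means[arm] + rng.gauss(0, 0.1)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running average update
        total += reward
    return total / rounds

for eps in (0.0, 0.05, 0.2):
    print(f"explore_rate={eps:.2f}  average reward={simulate(eps):.3f}")
```

With zero exploration, the agent never samples options whose estimates happen to look worse, so it tends to keep pulling whichever arm first looked good and is slow to notice the shift. That is the sense in which exploration is a public good someone has to pay for.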

This is the context for a Marginal Revolution post that nagged at me enough to provide the initial motivation to write this whole thing. Quoted in full, and this is not about the examples, it is about the method:

No, you don’t always have to agree with the majority of the educated people, but I would say this. For whatever set of views you think is justified, try to stick to the versions of those views held by well-educated, reasonable, analytically-inclined people. You will end up smarter over time, and in better places. Peer effects are strong, including across your ideological partners.

When I hear that a particular group defends liberty, such as the Ottawa truckers’ convoy, while this is partially true it makes me nervous. As a whole, they also seem to believe a lot of nonsense and to be, in procedural terms, not exactly where I would want them on scientific method and the like. Fair numbers of them seem to hold offensive beliefs as well. Whine about The Guardian if you like, but I haven’t seen any rebuttal of this portrait of the views of their leaders. Ugh.

I recall taking a lot of heat for my 2007 critique of Ron Paul and his movement, but that example illustrates my points perfectly. Those people did defend liberty in a variety of relevant ways, but so many of them have ended up in worse spaces. And that is exactly what I predicted way back when.

Look for strong analytical abilities, and if you don’t see it, run the other way.

Here is a defense of the Freedom Convoy. You can read it for yourself, but it doesn’t change my mind. Here is I think a wiser account. I’ll say it again: “Look for strong analytical abilities, and if you don’t see it, run the other way.” I’m running.

When I saw this, it set off loud alarm bells in my head. Something felt deeply wrong.

Essentially it was saying that if the Wrong Kind of People, who expressed Wrong Views that oppose Narrative, were advocating or backing something, you needed to run away even if they were right this time. Either the Covid restrictions need to be lifted or they don’t. ‘The people who advocate lifting restrictions hold offensive other views’ doesn’t even have an obvious sign on its direction in terms of what it makes more likely to be true. On the one hand, you could argue those people generally have worse epistemics. On the other hand, there are a lot of offensive views that are true, so anyone who has no offensive views also does not have great epistemics. If the objection is that such folks let others know they hold offensive views, that too has its impacts on epistemics.

And since the argument ‘people who are known to hold offensive views hold this view’ is used to argue this view too is offensive or wrong, there is an obvious bias in the discourse against such views. So even if you think that offensive-view-holders are more often wrong about other things, you need to ask if the marketplace of ideas is updating too much versus not enough on that information before you too can update. Right now, it seems like there is too strong a bias against offensive-view-holders, stronger than is justified when seriously examining their other views, and that this is causing many non-offensive views such folks often hold to be wrongly discounted, and sometimes causing them to become offensive views.

Thus it seems more like ‘offensive-view-holders hold this view, therefore this view will likely become offensive even if it is true and/or holding this view will cause others to lower your status because they think you hold offensive views’ and advice to therefore run the other way. With a side of ‘if you hold this view it may cause you to adopt other offensive views because of the associations.’

Which is not bad advice on that level, if you care about such things, but as a Kantian imperative it gives far too much power to those who decide what is offensive and to how things look and sound, and too little power to truth.

A similar thing can be said about Ron Paul. It does seem true that many who supported Ron Paul are now in worse spaces. Even given that this was predictable, why should it bear on the question of the quality of Ron Paul’s ideas? His statements are either true or false, his proposals worthwhile or otherwise, and ‘who supports X’ being a way to judge whether you should support X seems like letting coalitional politics (simulacra level 3 considerations) outweigh physical world modeling (simulacra level 1 considerations). Which always gives me a giant pit of horror.

Yes, there is a reasonable counter that the OP here is simply saying to stick to the smarter versions of these ideas, but that effectively translates to suggesting in-practice opposition to the practical versions that are on offer. Ron Paul.

And that specific advice is also worded in a burn-the-commons kind of way that raises my alarms: “For whatever set of views you think is justified, try to stick to the versions of those views held by well-educated, reasonable, analytically-inclined people.”

This reads once more as a ‘do not attempt to think for yourself and decide what is true, instead rely on the opinions of others,’ although at least there is room to evaluate potential others a bit. Whereas if you are aware and smart enough to figure out who such people are, it seems like you are also aware and smart enough to be one of them.

The argument that one should choose views for their peer effects scares me. I agree that peer effects of this sort are real, but going down the road where one chooses to believe things for that reason seems terrifying with lots of obvious downsides. A person doing so too seriously, in a real sense, does not have a mind.

And all of this seems to entwine the questions of who should be supported or opposed or lowered or raised in status with the question of what beliefs one should hold generally, instead of keeping them distinct, perhaps on the belief that most people cannot keep those distinct and it is foolish to suggest that they try.

One could however flip this. The above is suggesting that the OP calls for the use of a rock, but its central warning is the opposite.

It is saying beware those who are worshiping rocks, for they worship too strongly.

It is especially saying that those who worship rocks that are already giving some crazy answers now are going to give increasingly worse answers over time. Do not hitch your wagon to them.

If you are a strong analytical thinker, what you very much are not is a rock. You are not following a simple heuristic, or if you are, you will know when to disregard it.

Ron Paul had a mix of good and bad ideas. A more thoughtful and analytical libertarian would also have had a mix of good and bad ideas. One difference would hopefully be a better mix, with more good ideas and fewer bad ideas. The more salient difference here is the decision algorithm generating Ron Paul’s ideas. Thus, even if he happens to have a lot of ideas you agree with, when you see his other ideas or he generates new ones, they’re unlikely to be as good. The moment his ‘FREEDOM!’ rock goes wrong you’re in a lot of trouble, and you know this because you already have examples, unless one disagrees and thinks he was doing something better. That goes double for those whose rock said “RON PAUL!”

One could also say at least Ron Paul has a rock, and thus has predictable thoughts that are roughly consistent and are not corrupted by various political considerations as much as those of his rivals. Whereas the alternatives are much worse than rocks. Or alternatively, that the other candidates all have rocks that say “DO WHAT GETS YOU ELECTED” and you’d prefer an actual analytical thinker but you’ll take “FREEDOM!” over “DO WHAT GETS YOU ELECTED” any day.

Thus, one could evaluate the Convoy in a similar fashion, and assume that things are bound to go off the rails regardless of whether you agree with where they start out.

It is also a good example of how having a rock, in this case perhaps again one that says “FREEDOM!”, provides a simple heuristic, but one that is not terribly useful. Like any good rock it can be interpreted any number of ways and depends on the context. Somehow the output of this rock became blocking freedom of movement. Most heuristics that seem simple are not at core simple; you still have to massage the data into the correct format, and that is trickier than it sounds and often gets messed up.

It would certainly be a mistake to buy into a highly error-prone intellectual package once one notices that it is full of nonsense. One would not want to become a ‘fellow traveler’ and try to get into Aumann agreement with them or anything, or otherwise link your status wagon to it in the bigger picture, but neither of those need be the relevant standard.

No wagons need be hitched in order to evaluate what is true. Nor should it much matter that some other solution is first best. Not when evaluating claims and ideas. One needs to hold true to the Litany of Tarski, or else paper stops beating rock. Then rock will truly be strong.

Yet sometimes, at least within a given magisterium, one has to effectively write some name down on a rock, even if it does not fully substitute for your brain. If there is no simple heuristic that will work, making sure there is a non-rock at the end of the chain does seem wise.