Making Expertise Legible: Being right should make you respected, not the other way around


I will be hopping on a long train of thought largely already fleshed out by Scott Alexander and Zvi. The problem they are talking about is complicated, so I recommend reading those linked articles, but for those with little time or a poor memory, I will briefly summarize.

Political appointments and government bureaucracies are selective systems; behaviors that reinforce the power of the people above and around you usually get rewarded with promotions and more power. (This is Zvi’s concept of Immoral Mazes.) The CDC is such a system. Scott Alexander argues that this is one reason the rationalist blogosphere has recognized that information about the coronavirus from folks like Zvi is generally more accurate and useful than information from authoritative figures like Dr. Fauci. In his words, Zvi can optimize for “being right,” but:

When the Director of the CDC asserts an opinion, she has to optimize for two things—being right, and keeping power. If she doesn’t optimize for the second, she gets replaced as CDC Director by someone who does. That means she’s trying to solve a harder problem than Zvi is, and it makes sense that sometimes, despite having more resources than Zvi, she does worse at it.

Zvi’s response proposed that the situation is actually worse than this: there is not even an attempt to optimize some combination of being right and keeping power, the desire to do good for its own sake having been trained out of such people long ago. Instead, people in high-level positions act on a learned instinct to preserve their position.

Scott Alexander added a bonus post indicating that part of the reason “real” expertise doesn’t reach the policy level is that journalists can’t find insider sources willing to be contrarian, and that outsider sources simply don’t meet the standard to be cited in serious articles. In addition, many experts don’t trust journalists to faithfully reproduce their real opinions instead of writing hatchet jobs. I’d like to focus on these two problems. If we can improve either of them, and get more contrarian opinions taken seriously (ideally, when they’re right), that can only help put pressure on institutions to improve.

I. A Journalism-Side Solution

It would be good if journalists could find and honestly portray contrarian opinions, and if contrarians could trust they won’t be misrepresented.

No newspaper or online magazine seems able to keep the trust of outsiders for very long, and the recent debacle with Jordan Peterson makes me even more convinced that some journalists are especially motivated to discredit outsiders at every opportunity. But the growth of long-form podcast interviews, as well as platforms like Substack, shows that a demand for outsider voices exists. I think media organizations may be leaving money on the table when they allow their journalists to burn bridges with contrarians. If sufficiently well-established people in journalism became convinced of this, or a new organization arose with a reputation for not straw-manning everyone, we might see a healthier public dialogue that results in saner policy being adopted faster. I don’t have any actionable ideas about how to do this but would be interested in hearing them.

II. An Expert-Side Solution

It would be good if people who are Smart and Correct About Things were taken more seriously, and if authorities had a stronger incentive not to be intentionally Stupid and Incorrect About Things. I have some half-baked ideas on how this could be approached.

I recently downloaded a browser extension from Ground News, an app for fighting political echo chambers. It works by identifying the perspective of an article I’m reading and recommending pieces on the same topic from other political perspectives. It also identifies what it calls “blindspots”: stories disproportionately covered by the right or the left (for example, a story about an Islamist terror attack in a Middle Eastern nation might be ignored by left-wing media while many right-wing sources address it, and the opposite for stories about the Trump family using campaign donations to pay off their own debts).

I can imagine a version of this centered on expert sources and institutions. Say Dr. Anthony Fauci is quoted in a news article: a tooltip appears indicating his tendency to agree or disagree with major institutions or other experts in his field. If he’s made specific predictions, the tool can show how accurate he’s been, and so on.
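To make the tooltip idea concrete, here is a minimal sketch in Python of two numbers it might surface: a Brier score over resolved predictions and a simple agreement rate with an institution’s published positions. All of the names, fields, and data here are made up for illustration; a real system would need a curated database of claims and how they resolved.

```python
from dataclasses import dataclass

@dataclass
class PredictionRecord:
    claim: str
    forecast: float  # the expert's stated probability that the claim is true
    outcome: bool    # how the claim actually resolved

def brier_score(records: list[PredictionRecord]) -> float:
    """Mean squared error between stated probabilities and outcomes.
    0.0 is perfect; 0.25 is what flat 50/50 guessing earns."""
    if not records:
        raise ValueError("no resolved predictions to score")
    return sum((r.forecast - float(r.outcome)) ** 2 for r in records) / len(records)

def agreement_rate(expert_positions: dict[str, bool],
                   institution_positions: dict[str, bool]) -> float:
    """Fraction of shared topics on which the expert matched the institution."""
    shared = expert_positions.keys() & institution_positions.keys()
    if not shared:
        return float("nan")
    agree = sum(expert_positions[t] == institution_positions[t] for t in shared)
    return agree / len(shared)

# Hypothetical example: two resolved predictions and three shared topics.
records = [PredictionRecord("masks reduce spread", 0.9, True),
           PredictionRecord("lockdowns end by June", 0.7, False)]
expert = {"masks": True, "boosters": True, "school closures": False}
cdc    = {"masks": True, "boosters": True, "school closures": True}
print(f"Brier score: {brier_score(records):.3f}")        # lower is better
print(f"Agreement with CDC: {agreement_rate(expert, cdc):.0%}")
```

A lower Brier score means better-calibrated predictions, and the agreement rate gives a crude measure of how contrarian an expert is relative to, say, the CDC.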

Independently, public figures could be catalogued and rated on a website similar in concept to Rate My Professor. Each figure could have their own page with a brief summary of positions they’ve publicly held, how often they’ve changed their mind, whether they tend to agree with centralized opinions like those of the CDC, whether the big organizations eventually come around to positions they adopted early, and so on. With the participation of the experts themselves, it could even become a sort of Celebrity-League PredictIt; experts who wish to improve their reputations could do so by making bold contrarian claims that turn out to be true.
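One way to make the “bold contrarian claims” incentive precise is to score forecasts relative to the consensus rather than in absolute terms. The sketch below is my own assumption about the mechanism, not anything PredictIt actually does: a relative log score that rewards an expert exactly in proportion to how much they beat the consensus forecast.

```python
import math

def relative_log_score(expert_p: float, consensus_p: float, outcome: bool) -> float:
    """Log score of the expert's forecast minus that of the consensus forecast.

    Positive means the expert beat the consensus on this question, and the
    reward is largest exactly when a bold contrarian call turns out right."""
    p_expert = expert_p if outcome else 1.0 - expert_p
    p_consensus = consensus_p if outcome else 1.0 - consensus_p
    return math.log(p_expert) - math.log(p_consensus)

# A contrarian 85% "yes" against a 20% consensus scores highly if it resolves
# yes, and is punished harder than a consensus-hugger's miss if it resolves no.
print(relative_log_score(0.85, 0.20, True))   # ~ +1.45
print(relative_log_score(0.85, 0.20, False))  # ~ -1.67
```

The log score is one proper scoring rule among several; the design choice that matters is benchmarking against the consensus, so that parroting the CDC earns exactly zero.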

To some extent I think I am just reinventing Wikipedia with Politifact smushed inside. But a well-made machine learning system could automate the process of linking together people who write or speak about similar issues, and of identifying and operationalizing the claims they make. Given enough participation and interest, we could make insiders’ reputations (and job security) depend more on being right as early as possible, as outsiders beat them to the obvious conclusions over and over again.
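As a toy illustration of just the “linking together people who write about similar issues” step, here is a sketch using TF-IDF cosine similarity over made-up statements. A real system would work from actual articles and transcripts and use far better text representations; this only shows the shape of the matching step.

```python
# Toy sketch of the linking step, using TF-IDF cosine similarity as a
# deliberately simple stand-in for whatever ML system would really be used.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

statements = {
    "Expert A": "Masks and ventilation substantially reduce airborne transmission.",
    "Expert B": "Improving indoor air quality and mask use cuts transmission risk.",
    "Expert C": "Quantitative easing risks long-run inflation in asset prices.",
}

names = list(statements)
vectors = TfidfVectorizer().fit_transform(statements.values())
sims = cosine_similarity(vectors)

# Report each expert's closest match: A and B should pair off, while C's
# best match scores near zero, flagging that C writes about different issues.
for i, name in enumerate(names):
    score, match = max((sims[i, j], names[j]) for j in range(len(names)) if j != i)
    print(f"{name} is most similar to {match} (cosine {score:.2f})")
```

Once people are clustered by topic, the harder, still-unsolved part is operationalizing their claims into resolvable predictions that scoring tools like the ones sketched above can consume.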

A public record of who made the right call, and when, on issues of public import like the coronavirus might help bring public officials who operate on simulacrum level 3 (speech as tribal signaling) towards level 2 (speech that at least tries to shape beliefs about reality), or maybe even level 1 (speech that simply describes reality), may we be so blessed.

I am looking forward to any reactions.