Criticism of some popular LW articles

My composition teacher in college told me that in some pottery schools, the teacher holds up your pot, examines it, comments on it, and then smashes it on the floor. They do this for your first 100 pots.

In that spirit, this post's epistemic status is SMASH THIS POT.

As an experiment, I’m choosing three popular LW posts that happen to be at the top of my feed. I’m looking for the quotes I most disagree with, and digging into that disagreement. This is because I notice I have a tendency to just consume curated LW content passively. I’d instead like to approach it with a more assertively skeptical mindset.

Though I believe in the principle of charity, I think one of LW’s comparative advantages over other online spaces is that frank, public, object-level disagreement is permitted here, as long as we avoid ad hominems and trolling.

Furthermore, I’m focusing here on what I took away from these posts, rather than what these authors intended, or what they learned by writing them. Insofar as the author was specifically trying to convince me to take these statements seriously, they failed. Insofar as they had some other purpose in mind, I have absolutely no opinion on the matter.

So with personal apologies to the three excellent writers I happened to select for this exercise, here I go.

1. The Mystery of the Haunted Rationalist

Quote I disagree with:

So although it’s correct to say that the skeptics’ emotions overwhelmed their rationality, they wouldn’t have those emotions unless they thought on some level that ghosts were worth getting scared about.

No. They had those emotions because they thought on some level that a dark, unfamiliar environment their community says is scary might be unsafe. Hence, I’d reword the line as: “This looks suspiciously like I’m making an expected utility calculation. Probability of being killed by ~~ghost~~ something dangerous living in the house * value of my life, compared to a million dollars.”
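To make the rewording concrete, here’s a minimal sketch of that expected-utility comparison. Every number in it is a hypothetical placeholder I’ve made up for illustration, not a figure from Scott’s post:

```python
# A toy version of the expected-utility comparison above.
# All numbers are hypothetical placeholders, not figures from the post.

p_danger = 1e-6        # assumed probability that something dangerous lives in the house
value_of_life = 1e7    # assumed dollar value placed on one's own life
prize = 1e6            # the offered million dollars

expected_loss = p_danger * value_of_life
print(f"expected loss: ${expected_loss:,.0f} vs. prize: ${prize:,.0f}")
# -> expected loss: $10 vs. prize: $1,000,000
# Under these assumptions, the "take the bet" level of belief wins easily.
```

The point is that the calculation goes through just fine with “something dangerous” substituted for “ghost”; no belief in ghosts is required for the fear to be about something.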

Impact on conclusion:

But if that’s true, we’re now up to three different levels of belief. The one I profess to my friends, the one that controls my anticipation, and the one that influences my emotions.

  • There are no ghosts; profess skepticism.

  • There are no ghosts; take the bet.

  • There are ghosts; run for your life!

Instead of three levels of belief, we have three levels of storytelling. The story my friends and I share, the story that controls my anticipation, and the story that’s hardwired into my amygdala.

  • Wouldn’t it be fun to stay the night in a “haunted house”?

  • There’s most likely no danger; take the bet.

  • There’s a small but important possibility of danger; run for your life!

Scott is conflating the motivations under which two people of his background are likely to stay the night in a haunted house with the circumstances under which two skeptics might have a serious need to disprove the existence of ghosts by staying the night in one. The Korean fear of fan death is not very much like the Euro-American fear of haunted houses, a fear that local scientists and the government obviously don’t endorse.

2. The Tallest Pygmy Effect

Quote I disagree with:

Tallest pygmy effects are fragile, especially when they are reliant on self-fulfilling prophecies or network effects. If everyone suddenly thought the Euro was the most stable currency, the resulting switch would destabilize the dollar and hurt both its value and the US economy as a whole.

This is begging the question. If everyone suddenly thought the Euro was the most stable currency, something dramatic would have had to happen to shift markets’ assessment of the fundamentals of the US vs. EU economies and governments. Economies are neither fragile nor passive, and these kinds of mass shifts in opinion on economic matters don’t blow with the wind. Furthermore, people are likely to hedge their bets: if the US and EU currencies are similar in perceived stability, serious investors are likely to diversify between them.
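To illustrate the hedging point, here’s a toy two-currency portfolio calculation. The volatilities and correlation are invented for the example, not real market figures:

```python
# Why investors holding two similarly stable currencies tend to diversify
# rather than stampede from one to the other. All parameters are invented.
import math

sigma_usd = 0.05   # assumed volatility of dollar holdings
sigma_eur = 0.05   # assumed volatility of euro holdings
rho = 0.3          # assumed correlation between the two

def portfolio_vol(w):
    """Volatility of a portfolio with weight w in USD and (1 - w) in EUR."""
    var = ((w * sigma_usd) ** 2
           + ((1 - w) * sigma_eur) ** 2
           + 2 * w * (1 - w) * rho * sigma_usd * sigma_eur)
    return math.sqrt(var)

print(f"all-in on one currency: {portfolio_vol(1.0):.4f}")  # 0.0500
print(f"50/50 split:            {portfolio_vol(0.5):.4f}")  # 0.0403
```

The split portfolio has lower volatility than either all-in position, which is why a sudden wholesale switch to the euro isn’t how risk-averse investors would actually behave.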

“Tallest pygmy effect” is another term for “absolute advantage,” but with the added baggage of being potentially offensive, of pumping our intuitions toward seeing institutions as human-like in scale, and of being disconnected from “comparative advantage” and “absolute advantage,” the standard economic jargon with useful tie-ins to widely available pedagogical material in that field.

Impact on conclusion:

We shouldn’t use the term “tallest pygmy effect,” and should be very skeptical of LessWrong takes on economic issues unless there’s strong evidence the presenter knows what they’re talking about. This updates me in the direction of popularity being a poor proxy for accuracy or usefulness.

3. Is Rhetoric Worth Learning?

Quotes I disagree with:

Quote 1:

On LessWrong, people often make a hard distinction between being correct and being persuasive; one is rational while the other is “dark arts.”

No. On LessWrong, people use the term “dark arts” to refer specifically to techniques of persuasion that deliberately interfere with people’s truth-seeking abilities. Examples might include personal attacks such as shaming, ostracism, or threats; deliberately fallacious reasoning such as motte-and-bailey or Gish gallop tactics; or techniques of obfuscation and straight-up lying.

Being persuasive isn’t “dark arts”; it’s just that good rhetoric is equally useful to anyone and is thus a symmetric weapon, one whose use doesn’t inherently help humanity progress toward truth.

Quote 2:

From a societal perspective, making any kind of improvement, at any scale above literally one-man jobs, depends on both correctness and persuasiveness. If you want to achieve an outcome, you fail if you propose the wrong method and if you can’t persuade anyone of the right method.

This is true, but rhetoric is only one small part of persuasion. Since the post is about rhetoric yet justifies its importance by appealing to the necessity of persuasion, I think this point needs to be made.

Quote 3:

Some of the things I think go into talking well:

Emotional Skills

  • How to be aware of other people’s points of view without merging with them

  • How to dare to use a loud, clear voice or definite language

  • How to restrain yourself from anger or upset

[Etc.]

An analogous list might be:

Some of the things I think go into doing math well:

Adding Skills

  • Knowing how to add small numbers in your head

  • Knowing how to transform repeated addition into multiplication

  • Knowing that the more positive numbers you add, the larger the output gets

[Etc.]

While it’s true that someone who lacks the skills listed probably has some major shortcomings in the rhetoric or math departments, I think it’s unlikely that either of these lists is a useful decomposition of those skills. Both pump our intuitions in the direction of “if I practice these specific skills, I’ll get better at math,” or “if I practice the skills on this list that I intuitively think I’m bad at, I’ll get better at rhetoric overall.”

In fact, it’s not at all clear to me that this list has net positive expected value as a pedagogical tool for teaching rhetoric, or even that it gets across the basic idea the author intended. It’s the kind of thing I want to keep my System 1 away from so that it doesn’t get sucked in and mislead my System 2.

Impact on conclusion:

Rhetoric might be worth learning, but there’s also a reason we have professional editors. Division of labor is important, and it’s not clear that really good rhetoric is that much better at persuasion than an oft-chanted slogan. In fact, it’s perfectly possible that good rhetoric and a correct argument are merely correlated, both caused by underlying general intelligence and sheer depth of study. It’s also possible that rhetoric is not a symmetric weapon, and that it’s easier to dress a correct idea in persuasive rhetoric than to dress up an incorrect one.
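To illustrate the common-cause possibility, here’s a toy simulation in which rhetoric quality and argument correctness never influence each other but are each driven by an underlying ability plus independent noise. The model and its parameters are invented purely for illustration:

```python
# Toy model: rhetoric and correctness share a common cause but don't
# influence each other. All structure and parameters are invented.
import random

random.seed(0)
pairs = []
for _ in range(10_000):
    ability = random.gauss(0, 1)               # underlying general intelligence
    rhetoric = ability + random.gauss(0, 1)    # rhetoric = ability + noise
    correct = ability + random.gauss(0, 1)     # correctness = ability + noise
    pairs.append((rhetoric, correct))

n = len(pairs)
mr = sum(r for r, _ in pairs) / n
mc = sum(c for _, c in pairs) / n
cov = sum((r - mr) * (c - mc) for r, c in pairs) / n
sr = (sum((r - mr) ** 2 for r, _ in pairs) / n) ** 0.5
sc = (sum((c - mc) ** 2 for _, c in pairs) / n) ** 0.5
print(f"correlation ~ {cov / (sr * sc):.2f}")
# Prints roughly 0.50: a solid correlation with no causal arrow between the two.
```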

Hey—why do we all seem to assume that rhetoric is uncorrelated with truth, anyway?

Aristotle didn’t seem to think so:

Nevertheless, the underlying facts do not lend themselves equally well to the contrary views. No; things that are true and things that are better are, by their nature, practically always easier to prove and easier to believe in.

Assessment:

Respectively, I see these posts as featuring a misguided comparison, displaying a lack of scholarship and putting forth an unfounded assertion as a stylized fact, and meandering around on a topic rather than delivering on the promise implied by the title.

It feels intuitively true that our minds have ~separate systems for justifying our beliefs and changing our beliefs. Reading a post while looking for things to disagree with, and with the intention of stating those points of disagreement clearly, feels different from my normal consumption patterns.

I notice myself feeling nervous both about a negative reaction to my takes on these posts and about the possibility that others might return the favor when they read mine.

Overall, this experience leaves me with two equally concerning and compatible conjectures.

a. My reaction to rationalist content is governed by my frame of mind. If I read it seeking wisdom, then wisdom I shall find. If I read it to criticize, then I’ll find things to be critical of. Without some more formal structure in place, the nature of which I’m unaware, I am not able to “assess” content for correctness or usefulness; I can only produce positive or negative feedback. This reminds me of Eliezer’s piece Against Devil’s Advocacy, except that I’m less convinced than he seems to be that there’s such a thing as “true thinking” distinct from the production of rationalizations. Maybe if we knew what the truth was, we could measure how efficiently different modes of thinking would get us there. But that’s the problem, right? We don’t know the truth, don’t know the goal, and so it’s very hard to put a finger on this “true thinking” thing.

b. There is a lot of error-ridden content on LessWrong. The more true (a) is, the more true I should expect (b) to be. And if error-ridden content can influence my frame of mind, then the more true (b) is, the more likely (a) is to be true as well. Reading LessWrong is like trying to learn a subject by reading student essays: it’s not a good strategy.

This experiment shifts me toward seeing LessWrong as closer to a serious-minded fan fic community than a body of scholarship. You can learn to write really well by producing fan fic. But you probably can’t learn to write well by reading primarily fan fic. Yet somebody needs to read the fan fic to make the writing of it a meaningful exercise. So reading and commenting on LessWrong is something we do as an altruistic act, in order to support the learning of our community. Posting on it, reading external sources, and inquiring into the conditions of our own lives and telling our own stories is how we transform ourselves into better thinkers.

I believe that reading and writing for LessWrong has made me a much better thinker and enhanced my leadership skills. It has my blessing. Scott Alexander and Elizabeth, my first two victims, are thinkers whom I respect and whose writings and conversation I’ve found useful. Thanks also to sarahconstantin, whose body of writing I read for the first time today, as far as I know, and whose thoughts on rhetoric I found interesting and insightful.