No Really, Why Aren’t Rationalists Winning?

Reply to Extreme Rationality: It’s Not That Great; Extreme Rationality: It Could Be Great; The Craft and the Community; and Why Don’t Rationalists Win?

I’m going to say something which might be extremely obvious in hindsight:

If LessWrong had originally been targeted at and introduced to an audience of competent business people and self-improvement health buffs instead of an audience of STEM specialists and Harry Potter fans, things would have been drastically different. Rationalists would be winning.

Right now, rationalists aren’t winning. Rationality helps us choose which charities to donate to, and as Scott Alexander pointed out in 2009, it gives clarity-of-mind benefits. However, as he also pointed out in the same article, rationality doesn’t seem to be helping us win in individual career or interpersonal/social areas of life.

It’s been nearly ten years since then, and I have yet to see any sign that this has changed. I considered the possibility that I just hadn’t heard about other rationalists’ practical successes, either because I didn’t become a rationalist until around 2015 or because no one was talking about them. Then I realized that was silly. If rationalists had started winning, at least one person would have posted about it here on lesswrong.com. I recently spoke to Scott Alexander, and he said he still agreed with everything in his article.

So rationalists aren’t winning. Why not? The Bayesian Conspiracy podcast (if I recall correctly) proposed the following explanation in one of their episodes: rationality can only help us improve a limited amount relative to where we started out. They predicted that rationalists who started out at a lower level of life success/cognitive functioning/talent cannot outperform non-rationalists who started out at a sufficiently high level.

This argument is fundamentally a cop-out. When others win in places where we fail, it makes sense to ask, “How? What knowledge, skills, qualities, or experience do they have which we don’t? And how might we obtain the same knowledge, skills, qualities, or experience?” To say that others are simply more innately talented than we are, and leave it at that, doesn’t explain the mechanism behind their hypothesized greater rate of improvement after learning rationality. It names a cause without describing how it works. And if there were such a mechanism, could we not replicate it so we could improve more anyway?

So why aren’t we winning? What’s the actual mechanism behind our failure?

It’s because we lack some of the skills we need to win—not because we don’t want to win, and not because we’re lazy.

Rationalists are very good at epistemic rationality. But there’s this thing that we’ve been referring to as “instrumental rationality” which we’re not so good at. I wouldn’t say it’s just one thing, though. Instrumental rationality seems like many different arts that we’re lumping together.

It’s more than that, though. We’re not just lumping together many different arts of rationality. As anyone who’s read the sequence A Human’s Guide to Words would know, categorization and labeling are not neutral actions for a human. By classifying all rationality as one of two types, epistemic or instrumental, we limit our thinking about rationality. As a result of this classification, we fail to acknowledge the true shape of rationality’s similarity cluster.

The cluster’s true shape is that of instrumental rationality: it is the art of winning, a.k.a. achieving your values. All rationality is instrumental, and epistemic rationality is merely one example of it. The art of epistemic rationality is how you achieve the value of truth. Up until now, “instrumental rationality” has been a catch-all term we’ve been using for the arts of winning at every other value.

While achieving the value of truth is extremely useful for achieving every other value, truth is still only one value among many. The skills needed to achieve other values are not the same as the skills needed to achieve the value of truth. That is to say, epistemic rationality includes the skill sets that are useful for obtaining truth, and “instrumental rationality” includes all other skill sets.

Truth is a precious and valuable thing. It’s just not enough by itself to win in other areas of life.

That might seem obvious at face value. However, I’m not sure we understand that on a gut level.

I have the impression that many of us assume that so long as we have enough truth, everything else will simply fall into place—that we’ll do everything else right automatically without needing to really develop or practice any other skills.

Perhaps that would be the case with enough computing power. An artificial superintelligence could perhaps play baseball extremely well with the following method:

1. Use math to calculate where the particles in the bat, the ball, the air, and all the players are moving.

2. Predict which particles have to be moved to and from what positions in order to cause a chain reaction that results in the goal state. In this case, the goal state would be a particle configuration that humans would identify as a won game of baseball.

3. Move the key particles to the key positions. If you fail to reach the goal state, adjust your priors accordingly and repeat the process.

An artificial superintelligence could perhaps navigate relationships, or discover important scientific truths, or really anything else, all by this same method, provided that it had enough time and computing power to do so.
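To make that loop concrete, here’s a toy sketch in Python. Every name and number in it is invented for illustration, with a single hidden “drag” parameter standing in for all of particle physics. The predict-act-update loop itself is trivial to write down; the difficulty hides entirely in step 1, having a world-model accurate enough to predict with:

```python
import random

# Toy stand-in for the three-step method above. The real version would
# simulate every particle; here the entire "physics" is one hidden number.

TRUE_DRAG = 0.7     # hidden property of the world (invented for illustration)
TARGET = 50.0       # distance that counts as the "goal state"
TOLERANCE = 1.0

def world(force):
    """The actual world: where the ball really lands, with a little noise."""
    return force * TRUE_DRAG + random.gauss(0, 0.2)

def plan(drag_estimate):
    """Steps 1-2: choose the action predicted to produce the goal state,
    using the current (possibly wrong) model of the world."""
    return TARGET / drag_estimate

drag_estimate = 1.0  # initial, mistaken belief about the world
for attempt in range(1, 21):
    force = plan(drag_estimate)            # step 2: pick the action
    outcome = world(force)                 # step 3: act
    if abs(outcome - TARGET) < TOLERANCE:  # goal state reached?
        print(f"Goal state reached on attempt {attempt}")
        break
    # Step 3 continued: adjust priors toward what the observation implies.
    drag_estimate += 0.5 * (outcome / force - drag_estimate)
    print(f"Attempt {attempt}: landed at {outcome:.1f}, "
          f"drag estimate now {drag_estimate:.2f}")
```

For the superintelligence in the thought experiment, the loop is the same; the impossible part is that it can’t just observe the world function, it has to compute it from physics.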

But humans are not artificial superintelligences. Our brains compress information into caches for easier storage, and our beliefs are organized into categorical levels. Even if we know that reality itself is all just one level, our brains don’t have the space to contain enough particle-level knowledge to succeed at life (assuming that particles really are the base level, but we’ll leave that aside for now). We will not succeed just by understanding particle physics, no matter how much reductionism we do; we need that knowledge compressed into different categorical levels or we can’t use it.

This includes procedural knowledge like “which particles need to be moved to and from what positions to win a game of baseball”. If our brains were big enough to hold that knowledge, then all we would need to do to win would be to obtain it and output the correct choice.

For an artificial superintelligence, once it has enough relevant knowledge, it would have all that it needs to make optimal decisions according to its values.

For a human, given the limits of human brains, having enough relevant knowledge isn’t the only thing needed to make better decisions. More knowledge can be extremely useful for achieving one’s goals beyond knowledge for its own sake, but only if one has the motivation, skills, and experience to leverage that knowledge.

Current rationalists are really good at obtaining knowledge, at least when we manage to apply ourselves. But we’re failing to leverage that knowledge. For instance, we ought to be dominating prediction markets and stock markets and producing a disproportionately high number of superforecasters, to the point where other people notice and take an interest in how we managed it.

In fact, betting in prediction markets and stock markets provides an external criterion for measuring epistemic rationality, just as martial arts success can be measured by the external criterion of hitting your opponent.
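As one concrete illustration of such a criterion, here’s a minimal sketch of the Brier score, a standard proper scoring rule (the forecasts below are invented for the example): each probabilistic prediction is penalized by its squared error against what actually happened, so better calibration shows up as a measurably lower number.

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and actual outcomes.
    `forecasts` is a list of (probability, happened) pairs; lower is better,
    and 0.25 is what a constant 50/50 hedger earns."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in forecasts) / len(forecasts)

# Invented forecasts by two forecasters on the same four events:
calibrated = [(0.9, True), (0.2, False), (0.8, True), (0.7, True)]
hedger = [(0.5, True), (0.5, False), (0.5, True), (0.5, True)]

print(brier_score(calibrated))  # 0.045
print(brier_score(hedger))      # 0.25
```

Prediction markets enforce the same discipline, with money on the line instead of a score.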

So why haven’t we been dominating prediction and stock markets? Why aren’t we dominating them right now?

In my own case, I’m still an undergraduate college student living largely off of my parents’ income. I can’t afford to bet: I don’t have enough money of my own, my income is irregular and hard to predict (which makes budgeting difficult), and I would need to explain the expense to my mother if I started betting. If I did have more money of my own, though, I would definitely spend some of it on this. Do a lot of other people here have such extenuating circumstances? Somehow that would feel like too much of a coincidence.

It’s more likely that many of us haven’t learned the instrumental skills needed to get ourselves to go out and bet. Such skills might include time management, to set aside time to bet, or interpersonal/communication skills, to make sure the terms of a bet are clear and that we only bet against people who will abide by those terms once they’re set.

Prediction markets and stock markets aren’t the only opportunity rationalists are failing to take advantage of. For example, our community almost entirely neglects public relations, despite its potential to raise the sanity waterline and thereby significantly increase staff and funds for the causes we care about. We need better interpersonal/communication skills for interacting with the general public, and we need to learn to be pragmatic enough to actually do that instead of succumbing to an irrational, deep-seated fear of appearing cultish.

Competent business people and self-improvement health buffs do have those skills. We don’t. That’s why we’re not winning.

In short, we need arts of rationality for the pursuit of values beyond mere truth. One of my friends who has read the Sequences has spent years beginning to map out those other arts, and he recently presented his work to me. It’s really interesting. I hope you find it useful.

(Note: Said friend will be introducing himself here and writing a sequence about his work later. When he does, I will add the links here.)