Instrumental Rationality 7: Closing Disclaimer

[Instrumental Rationality Sequence 7]

[A disclaimer that instrumental rationality as presented in this sequence is incomplete. Your feelings are also important! Pay attention to them.]

After reading through this sequence, you might be feeling excited to go out and try some of this stuff. You might be thinking about imposing this instrumental rationality stuff upon many areas of your life.

And for that, I have a major cautionary warning.

Rationality can be dangerous because it’s an ontology. And while it’s not the One Ontology to Rule Them All, it can often feel like that when you’re within the rationality framing.

Here’s an analogy:

Owen is a person who wants to get work done, but often finds himself playing video games. He also feels bad about doing so because it doesn’t fit in with his self-image. Maybe there’s also something here about how society has shaped his values, but the actual root cause isn’t that important. The main point is that some part of him likes playing the games.

So he’s following his intuitive feelings, but also there’s guilt somewhere in the system.

Now let’s say he bumps into instrumental rationality—planning, habits, motivation—the whole package.

What I’ve presented in this sequence is a way of looking at things, kind of like a set of special glasses.

Once Owen puts on these glasses, he starts to see new opportunities to use his shiny new techniques, like TAPs, to try and remove his video-game-playing habit. But note that the very idea of techniques, of transforming concepts into concrete actionables, is only something he sees when he puts on his Rationality Glasses.

My worry is that people will use rationality as the lens through which they view the world for so long that they forget there’s something important hidden underneath the Rationality Glasses layer. Owen might just end up thinking that the Rationality Glasses show how the world is, rather than merely offering one useful way of looking at it.

And instrumental rationality, or at least the way that I’ve presented it here, will have its own ideological biases. This isn’t necessarily bad; it’s a necessary consequence of any way of looking at the world. There’ll always be implicit values for any system you choose to use.

For rationality, these values are about striving hard for things like Optimization and Self-Improvement. I worry that this implicit valuation can be taken in a very wrong way. I see a failure mode where everything that doesn’t directly contribute to Optimization is seen as a “bad” thing which needs to be removed.

Owen might then see his video-game habit as something foreign and “bad” rather than a poorly understood part of himself. So when Owen tries to use rationality to forcibly remove those “disobedient” parts of his system, I think something quite terrible is happening, because he’s smothering vital parts of himself.

Just because those parts of yourself conflict with your “stated” values doesn’t mean they’re wrong. It’s important to recognize that many apparently “bad” parts of yourself also have good intentions.

After all, these were parts of himself that Owen had listened to prior to encountering rationality. They might be hidden under the current framing, but they’re still important.

It’s a little like if I just handed you an instruction manual for the human mind, but with a bunch of the pages missing.

There’s a lot that you’d now know about how things in the mind work, but if you just follow those directions, you won’t get the whole story. There will be other functions and knobs that are also important which you wouldn’t know about. If you only follow the manual and don’t trust your own sense of what else is critical, then you’re in trouble.

Sooner or later, something will break, and troubleshooting will be very, very difficult.

This is why I think it’s important to respect those “useless” (from the Rationality Glasses POV) parts of yourself. Just because you can’t see a part’s function doesn’t imply there isn’t one.

The best solution, as far as I can tell, is something like being able to take off the Rationality Glasses to get in touch with those gut, instinctual, and quiet parts of yourself. You need to be able to step away from the rationality virtues of Optimization and just accept all the parts of yourself.

When you stop trying to cut off or suppress different parts of yourself, something very different happens.

You get a shift where you…Just Do Things.

Motivation and willpower, for example, end up seeming like largely incoherent words. You’ll still have different tastes, but suddenly the question of “How can I force myself to do/not-do X?” just becomes irrelevant.

When you start integrating those quiet parts of yourself, you’re somehow more in control, even though you’re incorporating more dissent. I know it sounds silly on the surface. But there’s something very good happening here, I think, when you allow all parts of yourself to be fulfilled.

This, of course, is a critique of instrumental rationality as I’ve presented it throughout this sequence. Others have taken the field further.

A more sophisticated theory of instrumental rationality, like some of CFAR’s curriculum, might be more about communication, focusing on ways to integrate and dialogue between the explicit parts of yourself (which the Rationality Glasses endorse) and the implicit parts of yourself (which might not be endorsed, but are important nonetheless).


So, as you venture forth to try out things, just remember that the set of glasses you’ve got is incomplete.

And sometimes, you can see even more clearly without them.