Rationalists, Post-Rationalists, And Rationalist-Adjacents

Epistemic status: Hortative. I’m trying to argue for carving reality at a new joint.

I think it’s lovely and useful that we have labels, not just for rationalist, but for rationalist-adjacent and for post-rationalist. But these labels are generally defined extensionally, by pointing at people who claim them, rather than intensionally, by trying to distill what distinguishes those clusters.

I have some intensional definitions that I’ve been honing for a long time. Here’s the biggest one.

A rationalist, in the sense of this particular community, is someone who is trying to build and update a unified probabilistic model of how the entire world works, and trying to use that model to make predictions and decisions.

By “unified” I mean decompartmentalized: if there’s a domain where the model gives two incompatible predictions, then as soon as that’s noticed, it has to be rectified in some way.

And it’s important that it be probabilistic: it’s perfectly consistent to resolve a conflict between predictions by saying “I currently think the answer is X with about 60% probability, and Y with about 25% probability, and with about 15% probability I’m missing the correct option or confused about the nature of the question entirely”.
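
To make that concrete, here’s a toy sketch of what holding and updating such a probability assignment looks like as a Bayesian update. The hypotheses and numbers below are invented purely for illustration, not drawn from anything above:

```python
# Toy Bayesian update over three mutually exclusive hypotheses.
# All numbers here are invented for illustration.

priors = {
    "X": 0.60,                 # current best guess
    "Y": 0.25,                 # live alternative
    "missing/confused": 0.15,  # the question itself may be malformed
}

# How likely a new piece of evidence would be under each hypothesis
# (hypothetical values, just to drive the update).
likelihoods = {"X": 0.9, "Y": 0.3, "missing/confused": 0.5}

# Bayes' rule: posterior is proportional to prior * likelihood,
# then renormalize so the probabilities sum to 1.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: w / total for h, w in unnormalized.items()}

for h, p in posterior.items():
    print(f"P({h} | evidence) = {p:.2f}")
```

The point isn’t the arithmetic; it’s that “I might be confused about the question” gets an explicit slot in the model rather than being swept under the rug.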

The Sequences are aimed at people trying to do exactly this thing, and Eliezer focuses on how to not go horribly wrong in the process (with a special focus on not trusting one’s own sense of obviousness).

Being a rationalist isn’t about any specific set of conclusions: it’s not about being an effective altruist, or a utilitarian, or even an atheist. It’s about whether one is trying to do that thing or not. Even if one is doing a terrible job of it!

Truth-seeking is a prerequisite, but it’s not enough. It’s possible to be very disciplined about finding and assembling true facts, without thereby changing the way one thinks about the world. As a contrast, here’s how the New York Times, whose fact-checking quality is not in dispute, decides what to report:

“By and large, talented reporters scrambled to match stories with what internally was often called ‘the narrative.’ We were occasionally asked to map a narrative for our various beats a year in advance, square the plan with editors, then generate stories that fit the pre-designated line.”

The difference between wielding a narrative and fitting new facts into it, versus learning a model from new facts, is the difference between rationalization and rationality.

“Taking weird ideas seriously” is also a prerequisite (because some weird ideas are true, and if you bounce off of them you won’t get far), but again it’s not enough. I shouldn’t really need to convince you of that one.

Okay, then, so what’s a post-rationalist?

The people who identify as such generally don’t want to pin it down, but here’s my attempt at categorizing at least the ones who make sense to me:

A post-rationalist is someone who believes the rationalist project is misguided or impossible, but who likes to use some of the tools and concepts developed by the rationalists.

Of course I’m less confident that this properly defines the cluster, outside of groups like Ribbonfarm where it seems to fit quite well. There are people who view the Sequences (or whatever parts have diffused to them) the way they view Derrida: as one more tool to try on an interesting conundrum, see if it works there, but not really treat it as applicable across the board.

And there are those who talk about being a fox rather than a hedgehog (and who therefore see trying to reconcile one’s models across domains as harmful), and those who talk about how the very attempt is a matter of hubris: that not only can we not know the universe, we cannot even realistically aspire to decent calibration.

And then, of course:

A rationalist-adjacent is someone who enjoys spending time with some clusters of rationalists (and/or enjoys discussing some topics with rationalists), but who is not interested in doing the whole rationalist thing themself.

Which is not a bad thing at all! It’s honestly a sign of a healthy community that it appeals even to people to whom the project itself doesn’t appeal, and the rationalist-adjacents may be more psychologically healthy than the rationalists.

The real point of contention, as far as I’m concerned, is something I’ve saved for the end: not everyone who self-identifies as a rationalist fits the first definition very well, and that definition in fact carves out a more compact cluster than self-identification does.

And that makes this community, and this site, a bit tricky to navigate. There are rationalist-adjacents for whom a double-crux (a conversational technique for finding the underlying beliefs that, if changed, would change each party’s mind) would fail on many topics because they’re not interested in zooming in so close on a belief. There are post-rationalists for whom a double-crux would fail because they can just switch frames on the conversation any time they’re feeling stuck. And to try to double-crux with someone, only to have it fail in either of those ways, is an infuriating experience for those of us who thought we could take it for granted in the community.

I don’t yet know of an intervention for signaling that a conversation is happening under explicitly rationalist norms: it’s hard to do that in a way that doesn’t pressure others into insisting they’d follow those norms too. But I wish there were one.