Rationalists, Post-Rationalists, And Rationalist-Adjacents

Epistemic status: Hortative. I’m trying to argue for carving reality at a new joint.

I think it’s lovely and useful that we have labels, not just for rationalist, but for rationalist-adjacent and for post-rationalist. But these labels are generally made extensionally, by pointing at people who claim those labels, rather than intensionally, by trying to distill what distinguishes those clusters.

I have some intensional definitions that I’ve been honing for a long time. Here’s the biggest one.

A rationalist, in the sense of this particular community, is someone who is trying to build and update a unified probabilistic model of how the entire world works, and trying to use that model to make predictions and decisions.

By “unified” I mean decompartmentalized: if there’s a domain where the model gives two incompatible predictions, then as soon as that’s noticed it has to be rectified in some way.

And it’s important that it be probabilistic: it’s perfectly consistent to resolve a conflict between predictions by saying “I currently think the answer is X with about 60% probability, and Y with about 25% probability, and with about 15% probability I’m missing the correct option or confused about the nature of the question entirely”.

The Sequences are aimed at people trying to do exactly this thing, and Eliezer focuses on how to not go horribly wrong in the process (with a special focus on not trusting one’s own sense of obviousness).

Being a rationalist isn’t about any specific set of conclusions: it’s not about being an effective altruist, or a utilitarian, or even an atheist. It’s about whether one is trying to do that thing or not. Even if one is doing a terrible job of it!

Truth-seeking is a prerequisite, but it’s not enough. It’s possible to be very disciplined about finding and assembling true facts, without thereby changing the way one thinks about the world. As a contrast, here’s how the New York Times, whose fact-checking quality is not in dispute, decides what to report:

By and large, talented reporters scrambled to match stories with what internally was often called “the narrative.” We were occasionally asked to map a narrative for our various beats a year in advance, square the plan with editors, then generate stories that fit the pre-designated line.

The difference between wielding a narrative and fitting new facts into it, and learning a model from new facts, is the difference between rationalization and rationality.

“Taking weird ideas seriously” is also a prerequisite (because some weird ideas are true, and if you bounce off of them you won’t get far), but again it’s not enough. I shouldn’t really need to convince you of that one.

Okay, then, so what’s a post-rationalist?

The people who identify as such generally don’t want to pin it down, but here’s my attempt at categorizing at least the ones who make sense to me:

A post-rationalist is someone who believes the rationalist project is misguided or impossible, but who likes to use some of the tools and concepts developed by the rationalists.

Of course I’m less confident that this properly defines the cluster, outside of groups like Ribbonfarm where it seems to fit quite well. There are people who view the Sequences (or whatever parts have diffused to them) the way they view Derrida: as one more tool to try on an interesting conundrum, to see if it works there, but not to treat it as applicable across the board.

And there are those who talk about being a fox rather than a hedgehog (and who therefore see trying to reconcile one’s models across domains as harmful), and those who talk about how the very attempt is a matter of hubris: that not only can we not know the universe, we cannot even realistically aspire to decent calibration.

And then, of course:

A rationalist-adjacent is someone who enjoys spending time with some clusters of rationalists (and/or enjoys discussing some topics with rationalists), but who is not interested in doing the whole rationalist thing themselves.

Which is not a bad thing at all! It’s honestly a good sign of a healthy community that it appeals even to people to whom the project doesn’t, and the rationalist-adjacents may be more psychologically healthy than the rationalists.

The real issue of contention, as far as I’m concerned, is something I’ve saved for the end: that not everyone who self-identifies as a rationalist fits the first definition very well, and that the first definition is in fact a more compact cluster than self-identification.

And that makes this community, and this site, a bit tricky to navigate. There are rationalist-adjacents for whom a double-crux on many topics would fail because they’re not interested in zooming in so close on a belief. There are post-rationalists for whom a double-crux would fail because they can just switch frames on the conversation any time they’re feeling stuck. And to try to double-crux with someone, only to have it fail in either of those ways, is an infuriating feeling for those of us who thought we could take it for granted in the community.

I don’t yet know of an intervention for signaling that a conversation is happening on explicitly rationalist norms: it’s hard to do that in a way that others won’t feel pressured to insist they’d follow. But I wish there were one.