Twitter thread on postrationalists

I wrote the following as a thread on twitter. Kaj Sotala asked me to share it on LessWrong as well, so I have reproduced it here with minimal editing.

(If you are interested in my twitter content, but don’t use twitter, I recommend reading it on threadreader. The user experience of reading long twitter threads on twitter is pretty frustrating, and threadreader is somewhat better. Also, I think engaging with twitter is net-bad for most people, and I feel morally obliged to make sure my content is hosted elsewhere, so that I don’t add to the incentive for people to do things that are bad for them.)

Here’s my theory of the cluster of people that call themselves post-rationalists.

Long ago, @ESYudkowsky wrote the sequences. The sequences had a lot of explicit content, but they also had a “vibe.”

The vibe was something like “intelligence dominates everything; the way to power and winning is thinking really well. If you think well enough, you will be awesome.”

Also, “arrogance and snark about how insane everyone /​ the world is.”

Also, “ambitious, heroic, glorious.”

People read the sequences, and they got whatever they got out of them, but a LOT of what people picked up on was the “vibe”.

For instance, one major thing that I got from the sequences is that “rationalists should win”. That I should be able to get what I want by being clever, and that I shouldn’t often be in situations where I’m not getting what I want. And I should be indignant about them.

[Note: when I say “indignant” here, I don’t mean an attitude of “this is unfair. Now I will mope.” It was more like, “I’m better than this. I will now rise above.”]

In 2015, this became a sort of mantra for me.

In places where other people would be satisfied with the lemons that life gave them, _I_ should win.

The actual post where it says that rationalists should win is about Newcomb’s problem: it’s a [reasonably] abstract point about the philosophy of rational choice.

I mostly wasn’t, in this instance, making use of the explicitly stated content. It’s not like I was using some _specific_ reasoning technique that is more effective than baseline, and that’s why I would win.

I was adopting an attitude that I picked up on.

(Compare this to the case where a person reads the sequences and starts doing explicit Bayesian calculations, [edit: regularly], and expects to do better that way. In doing that, a person is responding more to the content, and less to the vibe.)

I want to emphasize that there’s a kind of arrogance, here, of “I’m different than other people, such that they sometimes don’t get what they want, but that is beneath my dignity”, which I totally had before reading the sequences.

I didn’t get this entirely from Eliezer. Rather, Eliezer put out a thing that resonated with me.

The sequences matched with, and _justified_ some attitudes that I already had. And they inculcated some new attitudes in me, via mimesis.

That’s not unique to me. Many of the people who were attracted to the sequences were attracted by the _vibe_, mostly.

It isn’t like people were ignoring the explicit content, but reading the sequences is FUN for some people.

Some of that is because it is just so fascinating.

But some of it is because you either resonate with, or enjoy, or feel ego-boosted, by the vibe.

You’ll like the vibe of the sequences more if you enjoy the snark and the superior attitude than if you’re offended by it. Or if you are proud of your intelligence, maybe?

So the sequences predictably attracted some particular types of people:

  • Some people who are smart and non-functional, and sincerely see “rationality” as a way to turn their life around, which is comfortable for them, because it involves doing what they were already good at: school stuff / playing with ideas.

  • Some people who are smart and non-functional, who are less trying to be better, and more looking for a way to feel like _they’re_ actually the superior ones; everyone else is irrational, and that’s what matters.

  • Some people who want to feel important, and found this x-risk narrative as a way to feel important, or have meaning.

  • Some people who resonate a lot with “thinking better is the path to success”, because that has actually worked for them in their experience.

  • Lots and lots of people who were nerd-sniped by the ideas, and find social satisfaction from the rationalist social game: jockeying with each other to show how much you know by bringing to bear community shibboleths and/or generically “smart frames” on a question.

The “people pick up on the vibe of the sequences” effect is magnified, because if you went to a rationalist meetup, it would be filled with people who self selected on the basis of resonating with or liking the vibe of the sequences.

**And if the explicit content deviates from the vibe, the content sometimes gets lost or muddled.**

[The paragraph below] explicitly says “Don’t be a straw Vulcan. Rationality is not about being emotionless!”

But also, the energetic “feel” of this paragraph, for me, is abstracted, and cognitive, and kind of keeping the emotions at arms length.

And people will often interpret sentences like these according to the vibe that they project on to them.

And the result is that people say words like “rationality is not about only using your S2”, but ALSO they mostly end up trying to do things with explicit S2 thinking.

People manage to do things like ignoring their emotions (while also thinking they’re awesome at emotions).

And it is pretty hard to notice if you’re doing something like this!

So “rationality”, for a lot of people, is, for better or for worse, largely about that vibe.

(Especially, though not uniquely, for people who are far from the in-person community. People who live in the Bay have more feedback channels with how this rationality stuff lives in the likes of Eliezer and Anna.)

And sometimes people live by that ethos for a while, and then find that it doesn’t serve them.

Like, the stance that they’re taking towards the world, or towards themselves, isn’t good for them, and doesn’t help them get what they want.

The way they relate with their emotions is bad for them.

Or the way they “overthink” things is neurotic instead of helpful.

Or the way that they intellectualize everything is isolating. They’re not actually satisfied with the human connection that they get that way.

And so they reject the ethos which they’ve come to associate with “rationality”.
(Which they probably learned, in part, from the sequences, and in part had all along.

And, even if they were doing it all along, rationality culture probably gave them some new and better tools and justifications for hurting themselves.)

But when they reject the rationality ethos, something kind of funny happens, because every time they make a specific claim about what they’re rejecting, some rationalist (like me!) will pop up and point out that the specific claim that they’re making about why rationality is not adequate is not only a thing that rationalists are aware of, but is RIGHT THERE in the sequences!

They’ll say something like “rationality is about thinking through everything, explicitly, and that’s not actually effective.”

And then I and others will face-palm and be like “did you READ the sequences? Attend a CFAR workshop? We say in like a hundred places that the point is not to reason through everything explicitly!”

https://twitter.com/ESYudkowsky/status/1462132727583440896?s=20&t=JWoyf4oI_wHmM3ZJgf_pfg

But that’s ok. People picked up on a vibe, and the way they embodied that vibe didn’t get them what they valued, and so they’re correctly pushing back against [something] that they don’t want.

In summary:

It’s all good. People should do things that are good for them.

Communication is hard.

It might be nice if people were more curious about how “rationality” sits in others, instead of fighting about what it is or isn’t. But no one has to do that either. : )