The Alignment Community Is Culturally Broken

Disclaimer: These are entirely my thoughts. I’m posting this before it’s fully polished because it never will be.

Epistemic status: Moderately confident. Deliberately provocative title.

Apparently, the Bay Area rationalist community has a burnout problem. I have no idea if it’s worse than base rate, but I’ve been told it’s pretty bad. I suspect that the way burnout manifests in the rationalist community is uniquely screwed up.

I was crying the other night because our light cone is about to get ripped to shreds. I’m gonna do everything I can to fight the forces that threaten to destroy us. You’ve heard this story before. Short timelines. Tick. Tick. I’ve been taking alignment seriously for about a year now, and I’m ready to commit. I’ve thought hard about what my strengths are. I’ve thought hard about what I’m capable of. I’m dropping out of Stanford, I’ve got something that looks like a plan, I’ve got the Rocky theme song playing, and I’m ready to do this.

A few days later, I saw this post. And it reminded me of everything that bothers me about the EA community. Habryka covered the object-level problems pretty well, but I need to communicate something a little more… delicate.

I understand that everyone is totally depressed because qualia are doomed. I understand that we really want to creatively reprioritize. I completely sympathize with this.

I want to address the central flaw in Akash+Olivia+Thomas’s argument in the Buying Time post, which is that it ignores the fact that people can actually improve at things.

There’s something deeply discouraging about being told “you’re an X% researcher, and if X>Y, then you should stay in alignment. Otherwise, do a different intervention.” No other effective/productive community does this. I don’t know how to put this, but the vibes are deeply off.

The appropriate level of confidence in a statement like “I can tell how good an alignment researcher you will be after a year of you doing alignment research” should be pretty low. At a year in, there are almost certainly ways to improve that haven’t been tried. Especially in a community so memetically allergic to the idea of malleable human potential.

Here’s a hypothesis. I in no way mean to imply that this is the only mechanism by which burnout happens in our community, but I think it’s probably a pretty big one. It’s not nice to be in a community that constantly hints that you might just not be good enough and that you can’t get good enough.

Our community seems to love treating people like mass-produced automatons with a fixed and easily assessable “ability” attribute. (Maybe you flippantly read that sentence and went “yeah it’s called g factor lulz.” In that case, maybe reflect on how good a correlate g actually is, in absolute terms, for the things you care about.)

If we want to actually accomplish anything, we need to encourage people to make bigger bets, and to stop stacking up credentials just so that fellow EAs will think they have a chance. It’s not hubris to believe in yourself.