Here’s the exit.

There’s a kind of game here on Less Wrong.

It’s the kind of game that’s a little rude to point out. Part of how it works is by not being named.

Or rather, attempts to name it get dissected so everyone can agree to continue ignoring the fact that it’s a game.

So I’m going to do the rude thing. But I mean to do so gently. It’s not my intention to end the game. I really do respect the right of folk to keep playing it if they want.

Instead I want to offer an exit to those who would really, really like one.

I know I really super would have liked that back in 2015 & 2016. That was the peak of my hell in rationalist circles.

I’m watching the game intensify this year. Folk have been talking about this a lot: how there’s a ton more talk of AI here, and a stronger tone of doom.

I bet this is just too intense for some folk. It was for me when I was playing. I just didn’t know how to stop. I kind of had to break down in order to stop. All the way to a brush with severe depression and suicide.

And it also ate parts of my life I dearly, dearly wish I could get back.

So, in case this is audible and precious to some of you, I’d like to point a way to ease.

The Apocalypse Game

The upshot is this:

You have to live in a kind of mental illusion to be in terror of the end of the world.

Illusions don’t look on the inside like illusions. They look like how things really are.

Part of how this one does the “daughter’s arm” thing (like the anosognosia patient who insists her paralyzed arm belongs to her daughter) is by redirecting attention to facts and arguments.

  • “Here’s why the argument about AI makes sense.”

  • “Do you have some alternative view of what will happen? How do you address XYZ?”

  • “What makes it an ‘illusion’? I challenge that framing because it dismisses our ability to analyze and understand yada yada.”

None of this is relevant.

I’m pointing at something that comes before these thoughts. The thing that fuels the fixation on the worldview.

I also bet this is the thing that occasionally drives some people in this space into psychosis, depression, or burnout.

The basic engine is:

  • There’s a kind of underlying body-level pain. I would tag this as “emotional pain” but it’s important to understand that I really am pointing at physical sensations.

  • The pain is kind of stored and ignored. Often it arose at a very young age but was too overwhelming, so child-you found methods of distraction.

  • This is the basic core of addiction. Addictions are when there’s an intolerable sensation but you find a way to bear its presence without addressing its cause. The more that distraction becomes a habit, the more it’s the thing you automatically turn to when the sensation arises. This dynamic becomes desperate and life-destroying to the extent that it triggers a Red Queen race: the pain keeps resurfacing, so the distraction has to keep escalating just to hold it at bay.

  • A major unifying flavor of the LW attractor is intense thought as an addictive distraction. And the underlying flavor of pain that fuels this addiction is usually some variation of fear.

  • In not-so-coincidental analogy to uFAI, these distracting thoughts can come to form autonomous programs that memetically evolve to have something like survival and reproductive instincts — especially in the space between people as they share and discuss these thoughts with each other.

  • The rationalist memeplex focuses on AI Ragnarok in part because it’s a way for the intense thought to pull fuel from the underlying fear.

In this case, the search for truth isn’t in service to seeing reality clearly. The logic of economic races to the bottom, orthogonality, etc. might very well be perfectly correct.

But these thoughts are also (and in some cases, mostly) in service to the doomsday meme’s survival.

Now, I know that thinking of memes as living beings is something of an ontological leap in these parts. It’s totally compatible with the LW memeplex, but it seems too woo-adjacent and triggers an unhelpful allergic response.

So instead, let me offer a reframe:

Your body’s fight-or-flight system is being used as a power source to run a game called “OMG AI risk is real!!!”

And part of how that game works is by shoving you into a frame where it seems absolutely fucking real. That this is the truth. This is how reality just is.

And this can be fun!

And who knows, maybe you can play this game and “win”. Maybe you’ll have some kind of real positive impact that matters outside of the game.

But… well, for what it’s worth, as someone who turned off the game and has reworked his body’s use of power quite a lot, it’s pretty obvious to me that this isn’t how it works. If playing this game has any real effect on the true world situation, it’s to make the thing you’re fearing worse.

(…which is exactly what’s incentivized by the game’s design, if you’ll notice.)

I want to emphasize — again — that I am not saying that AI risk isn’t real.

I’m saying that really, truly orienting to that issue isn’t what LW is actually about.

That’s not the game being played here. Not collectively.

But the game that is being played here absolutely must seem on the inside like that is what you’re doing.

Ramping Up Intensity

When Eliezer rang the doom bell, my immediate thought was:

“Ah, look! The gamesmaster has upped the intensity. Like preparing for a climax!”

I mean this with respect and admiration. It’s very skillful. Eliezer has incredible mastery in how he weaves terror and insight together.

And I don’t mean this at all to dismiss what he’s saying. Though I do disagree with him about overall strategy. But it’s a sincere disagreement, not an “Oh look, what a fool” kind of thing.

What I mean is, it’s a masterful move of making the game even more awesome.

(…although I doubt he consciously intended it that way!)

I remember when I was in the thick of this AI apocalypse story, everything felt so… epic. Even questions of how CFAR dealt with garbage at its workshops seemed directly related to whether humanity would survive the coming decades. The whole experience was often thrilling.

And on the flipside, sometimes I’d collapse. Despair. “It’s too much” or “Am I even relevant?” or “I think maybe we’re just doomed.”

These are the two sort-of-built-in physiological responses to fight-or-flight energy: activation, or collapse.

(There’s a third, which is a kind of self-holding. But it has to be built. Infants aren’t born with it. I’ll point in that direction a bit later.)

In the spirit of feeling rationally, I’d like to point out something about this use of fight-or-flight energy:

If your body’s emergency mobilization systems are running in response to an issue, but your survival doesn’t actually depend on actions on a timescale of minutes, then you are not perceiving reality accurately.

Which is to say: If you’re freaked out but rushing around won’t solve the problem, then you’re living in a mental hallucination. And it’s that hallucination that’s scaring your body.

Again, this isn’t to say that your thoughts are incorrectly perceiving a future problem.

But if it raises your blood pressure or quickens your breath, then you haven’t integrated what you’re seeing with the reality of your physical environment. Where you physically are now. Sitting here (or whatever) reading this text.

So… folk who are wringing their hands and feeling stressed about the looming end of the world via AI?

Y’all are hallucinating.

If you don’t know what to do, and you’re using anxiety to power your minds to figure out what to do…

…well, that’s the game.

The real thing doesn’t work that way.

But hey, this sure is thrilling, isn’t it?

As long as you don’t get stuck in that awful collapse space, or go psychotic, and join the fallen.

But the risk of that is part of the fun, isn’t it?

(Interlude)

A brief interlude before I name the exit.

I want to emphasize again that I’m not trying to argue anyone out of doing this intense thing.

The issue is that this game is way, way out of range for lots of people. But some of those people keep playing it because they don’t know how to stop.

And they often don’t even know that there’s something on this level to stop.

You’re welcome to object to my framing, insist I’m missing some key point, etc.

Frankly I don’t care.

I’m not writing this to engage with the whole space in some kind of debate about AI strategy or landscape or whatever.

I’m trying to offer a path to relief to those who need it.

That no, this doesn’t have to be the end of the world.

And no, you don’t have to grapple with AI to sort out this awful dread.

That’s not where the problem really is.

I’m not interested in debating that. Not here right now.

I’m just pointing out something for those who can, and want to, hear it.

Land on Earth and Get Sober

So, if you’re done cooking your nervous system and want out…

…but this AI thing gosh darn sure does look too real to ignore…

…what do you do?

My basic advice here is to land on Earth and get sober.

The thing driving this is a pain. You feel that pain when you look out at the threat and doom of AI, but you cover it up with thoughts. You pretend it’s about this external thing.

I promise, it isn’t.

I know. I really do understand. It really truly looks like it’s about the external thing.

But… well, you know how when something awful happens and gets broadcast (like the recent shooting), some people look at it with a sense of “Oh, that’s really sad” and are clearly impacted, while others utterly flip their shit?

Obviously the difference there isn’t in the event, or in how they heard about it. Maybe sometimes, but not mostly.

The difference is in how the event lands for the listener. What they make it mean. What bits of hidden pain are ready to be activated.

You cannot orient in a reasonable way to something that activates and overwhelms you this way. Not without tremendous grounding work.

So rather than believing the distracting thoughts that you can somehow alleviate your terror and dread with external action…

…you’ve got to stop avoiding the internal sensation.

When I talked earlier about addiction, I didn’t mean that just as an analogy. There’s a serious withdrawal experience that happens here. Withdrawal from an addiction is basically a heightening of the intolerable sensation (along with having to fight mechanical habits of seeking relief via the addictive “substance”).

So in this case, I’m talking about all this strategizing, and mental fixation, and trying to model the AI situation.

I’m not saying it’s bad to do these things.

I’m saying that if you’re doing them as a distraction from inner pain, you’re basically drunk.

You have to be willing to face the awful experience of feeling, in your body, in an inescapable way, that you are terrified.

I sort of want to underline that “in your body” part a bazillion times. This is a spot I keep seeing rationalists miss — because the preferred recreational drug here is disembodiment via intense thinking. You’ve got to be willing to come back, again and again, to just feeling your body without story. Notice how you’re looking at a screen, and can feel your feet if you try, and are breathing. Again and again.

It’s also really, really important that you do this kindly. It’s not a matter of forcing yourself to feel what’s present all at once. You might not even be able to find the true underlying fear! Part of the effect of this particular “drug” is letting the mind lead. Making decisions based on mental computations. And kind of like how minds can get entrained to porn, minds entrained to distraction via apocalypse fixation will often hide their power source from their host.

(In case that was too opaque for you just yet, I basically just said “Your thoughts will do what they can to distract you from your true underlying fear.” People often suddenly go blank inside when they look inward this way.)

So instead of trying to force it all at once, it’s a matter of titrating your exposure. Noticing that AI thoughts are coming up again, and pausing, and feeling what’s going on in your body. Taking a breath for a few seconds. And then carrying on with whatever.

This is slow work. Unfortunately your “drug” supply is internal, so getting sober is quite a trick.

But this really is the exit. As your mind clears up… well, it’s very much like coming out of the fog of a bender and realizing that no, really, those “great ideas” you had just… weren’t great. And now you’re paying the price on your body (and maybe your credit card too!).

There are tons of resources for this kind of direction. It gets semi-independently reinvented a lot, so there are lots of different names and frameworks for this. One example that I expect to be helpful for at least some LWers who want to land on Earth & get sober is Irene Lyon, who approaches this through a “trauma processing” framework. She offers plenty of free material on YouTube. Her angle is in the same vein as Gabor Maté and Peter Levine.

But hey, if you can feel the thread of truth in what I’m saying and want to pursue this direction, but you find you can’t engage with Irene Lyon’s approach, feel free to reach out to me. I might be able to find a different angle for you. I want anyone who wants freedom to find it.

But… but Val… what about the real AI problem?!

Okay, sure. I’ll say a few words here.

…although I want to point out something: The need to have this answered is coming from the addiction to the game. It’s not coming from the sobriety of your deepest clarity.

That’s actually a complete answer, but I know it doesn’t sound like one, so I’ll say a little more.

Yes, there’s a real thing.

And yes, there’s something to do about it.

But you’re almost certainly not in a position to see the real thing clearly or to know what to do about it.

And in fact, attempts to figure the real thing out and take action from this drunk gamer position will make things worse.

(I hesitate to use the word “worse” here. That’s not how I see it. But I think that’s how it translates to the in-game frame.)

This is what Buddhists should have meant (and maybe did/do?) when they talk about “karma”. How deeply entangled in this game is your nervous system? Well, when you let that drive how you interact with others, their bodies get alarmed in similar ways, and they get more entangled too.

Memetic evolution drives how that entangling process happens on large scales. When that becomes a defining force, you end up with self-generating pockets of Hell on Earth.

This recent thing with FTX is totally an example. Totally. Threads of karma/trauma/whatever getting deeply entangled and knotted up and tight enough that large-scale flows of collective behavior create an intensely awful situation.

You do not solve this by trying harder. Tugging the threads harder.

In fact, that’s how you make it worse.

This is what I meant when I said that actually dealing with AI isn’t the true game in LW-type spaces, even though it sure seems like it on the inside.

It’s actually helpful to the game for the situation to constantly seem barely maybe solvable but to have major setbacks.

And this really can arise from having a sincere desire to deal with the real problem!

But that sincere desire, when channeled into the Matrix of the game, doesn’t have any power to do the real thing. There’s no leverage.

The real thing isn’t thrilling this way. It’s not epic.

At least, not any more epic than holding someone you love, or taking a stroll through a park.

To oversimplify a bit: You cannot meaningfully help with the real thing until you’re sober.

Now, if you want to get sober and then you roll up your sleeves and help…

…well, fuck yeah! Please. Your service would be a blessing to all of us. Truly. We need you.

But it’s gotta come from a different place. Tortured mortals need not apply.

And frankly, the reason AI in particular looks like such a threat is that you’re fucking smart. You’re projecting your inner hell onto the external world. Your brilliant mind can create internal structures that might damn well take over and literally kill you if you don’t take responsibility for this process. You’re looking at your own internal AI risk.

I hesitate to point that out because I imagine it creating even more body alarm.

But it’s the truth. Most people wringing their hands about AI seem to let their minds possess them more and more, pouring ever more energy into thought, in a kind of runaway process that’s stunningly analogous to uFAI.

The difference is, you don’t have to make the entire world change in order to address this one.

You can take coherent internal action.

You can land on Earth and get sober.

That’s the internal antidote.

It’s what offers relief — eventually.

And from my vantage point, it’s what leads to real hope for the world.