Factorio, Accelerando, Empathizing with Empires and Moderate Takeoffs

I started planning this post before Cousin It’s post on a similar subject. This is a somewhat more poetic take on moderate, peaceful AI takeoffs.

Spoilers for the game Factorio and the book Accelerando. They are both quite good. But if you’re not going to get around to playing/reading them for a while, I’d just go ahead and read this thing (I think the game and story are still good if you go in knowing some plot elements).

i. Factorio

Factorio is a computer game about automation.

It begins with you crash landing on a planet. Your goal is to go home. To go home, you need to build a rocket. To build a rocket powerful enough to get back to your home solar system, you will need advanced metallurgy, combustion engines, electronics, etc. To get those things, you’ll need to bootstrap yourself from the stone age to the nuclear age.

To do this all by yourself, you must automate as much of the work as you can.

To do this efficiently, you’ll need to build strip mines, power plants, etc. (And, later, automatic tools to build strip mines and power plants).

One wrinkle you may run into is that there are indigenous creatures on the planet.

They look like weird creepy bugs. It is left ambiguous how sentient the natives are, and how they should factor into your moral calculus. But regardless, it becomes clear that the more you pollute, the more annoyed they will be, and they will begin to attack your base.

If you’re like me, this might make you feel bad.

During my last playthrough, I tried hard not to kill things I didn’t have to, and to pollute as little as possible. I built defenses in case the aliens attacked, but when I ran out of iron, I looked for new mineral deposits that didn’t have nearby native colonies. I bootstrapped my way to solar power as quickly as possible, replacing my smog-belching furnaces with electric ones.

I needed oil, though.

And the only oil fields I could find were right in the middle of an alien colony.

I stared at the oil field for a few minutes, thinking about how convenient it would be if that alien colony wasn’t there. I stayed true to my principles. “I’ll find another way”, I said. And eventually, at much time cost, I found another oil field.

But around this time, I realized that one of my iron mines was near some native encampments. And those natives started attacking me on a regular basis. I built defenses, but they started attacking harder.

Turns out, just because someone doesn’t literally live in a place doesn’t mean they’re happy with you moving into their territory. The attacks grew more frequent.

Eventually I discovered the alien encampment was… pretty small. It would not be that difficult for me to destroy it. And, holy hell, would it be so much easier if that encampment didn’t exist. There’s even a sympathetic narrative I could paint for myself: so many creatures were dying every day as they went to attack my base that it was in fact merciful to just quickly put down the colony.

I didn’t do that. (Instead, I actually got distracted and died.) But this gave me a weird felt sense, perhaps a skill, of empathizing with the British Empire. (Or most industrial empires, modern or ancient.)

Like, I was trying really hard not to be a jerk. I was just trying to go home. And it was still difficult not to just move in and take stuff when I wanted it. And although this was a video game, I think in real life it might have been, if anything, harder, since I’d be risking not just losing the game but losing my life, or the livelihoods of people I cared about.

So when I imagine industrial empires that weren’t raised by hippy-ish parents who believed colonialism and pollution were bad… well, what realistically would you expect to happen when they interface with less powerful cultures?

ii. Accelerando

Accelerando is a book about a fairly moderate takeoff of AGI.

Each chapter takes place 10 years after the previous one. There’s a couple-decade transition from “complex systems of narrow AIs start being relevant,” through “the first uploads and human/machine interfaces,” to “true AGI is a major player on Earth and in the solar system.”

The outcome here is… reasonably good, as things go. The various posthuman actors adopt a “leave Earth alone” policy—there are plenty of other atoms in the solar system. They start building modular chunks of a Dyson sphere, using Mercury and other hard-surface planets as resources. (A conceit of the book is that gas giants are harder to work with, so Jupiter et al. remain more or less in their present form.)

The modular Dyson sphere is solar-powered, and it’s advantageous to move your computronium as close as possible to the sun. Agents running on hardware closer to the sun get to think faster, which lets them outcompete those further away.

There are biological humans who don’t do any kind of uploading or neural interfacing. There are biological-esque humans who use various augmentations but don’t focus all their attention on competing on the fastest timescales with the most advanced posthumans.

The posthumans eventually disassemble all the terrestrial planets and asteroids. What’s left are the gas giants (hard to disassemble) and Earth.

Eventually (where by “eventually” I mean, “in a couple decades”), they go to great lengths to transport the surface of Earth in such a way that the denizens get to retain something like their ancestral home, but the core of the Earth can be used to build more computronium.

And then, in another decade or two (many generations from the perspectives of posthumans running close to the Sun’s heat), our posthuman offspring take another look at this last chunk of atoms...

...and sort of shrug apologetically and wring their metaphorical hands and then consume the last atoms in the solar system.

(By this point, old-school humans have seen the writing on the wall and departed the solar system.)

iii. Moderate Takeoffs

I periodically converse with people who argue something like: “a moderate takeoff of AGI is most likely, and there’ll be time to figure out what to do about it (in particular if humans get to augment themselves or use increasingly powerful tools to improve their strategizing).”

And… this just doesn’t seem that comforting to me. In the most optimistic possible worlds I imagine (where we don’t get alignment exactly right, but a moderate takeoff makes it easier to handle), human-level AI takes several decades to arrive, the people who don’t upload are outcompeted, and the final handwringing posthuman shrug takes… maybe a few centuries, max?

And maybe this isn’t the worst outcome. Maybe the result is still some kind of sentient posthuman society, engaged in creativity and various positive experiences that I’d endorse, which then goes on to colonize the universe. And it’s sad that humans and non-committed transhumans got outcompeted, but at least there’s still some kind of light in the universe.

But still, this doesn’t seem like an outcome I’m enthusiastic about. It’s at least not something I’d want to happen by default without reflecting upon whether we could do better. Even if you’re expecting a moderate takeoff, it still seems really important to get things right on the first try.