I think there’s a single framework that captures several of these ideas: the status quo is sticky, and it’s hard to steer out of, so if you think things are bad enough, you shake things up into a transient chaotic state and then as it settles down, you can guide the system into a state you like better. Different groups executing this strategy may seem aligned at the beginning, and their conflicting goals become apparent only after the status quo has been toppled. And it’s possible to lose badly enough in this second phase that you end up worse off than before, so that’s the gamble one takes when engaging in revolution instead of incrementalism.
That seems roughly right but I don’t think chaos/revolution is necessary for this thesis.
Imagine the world sitting in a deep basin. You’re trying to get it out. You can try a revolution: launch it up into the air and hold your breath waiting for it to land.
Or you can try, Sisyphus-style, to roll it out. You might gather a bunch of allies, whom you try to lead along a narrow and precarious ridgeline to your destination. Then, when you’re halfway to the top, some might notice that there’s a much easier path requiring much less of a push, and so the world gets shoved off your intended route, down the mountain pass and into a new crater.
The latter seems to have been the case with environmentalism and AI Safety. Each pushed the world fairly gradually, but it only takes a few individuals pushing the world downward into a new equilibrium for the upward pushers to lose that fight.
(There’s a Yudkowsky tweet somewhere that says something like this, which I’ve basically based this entire metaphor on.)