I think it might be more accurate to say you’re an efficient component in Moloch’s machine.
But if you care about how things like gradual disempowerment play out, then I think the “baddie / goodie / pawn-of-Moloch” framing is probably not very useful. It might be worth instead thinking more concretely, about things like:
- How much are my actions contributing to speeding up human disempowerment? [1]
- How could I keep my job (or whatever) while contributing as little as possible to various bad things?
- Who are the relevant actors I would need to coordinate with in order to slow things down? What, concretely, is stopping me from coordinating with them, and how could I fix that?
- What other important considerations are there, besides “speeding up adoption of AI / replacement of humans”?
- What could I do to offset harms I cause?
[1] Accounting for the other actors in the Molochian race. Imagine telling someone from (e.g.) 2016 that the above sentence is a reasonable thing to say in 2026. (The frog, it boils.)