You’re right on both counts.
On transitional risks: The separation equilibrium describes a potential end state, not the path to it. The transition would be extremely dangerous. While a proto-AGI might recognize this equilibrium as optimal during development (potentially reducing some risks), an emerging ASI could still harm humans while determining its resource needs or pursuing instrumental goals. Nothing guarantees safe passage through this phase.
On building ASI: There is indeed no practical benefit to deliberately creating ASI that outweighs the risks. If separation is the natural equilibrium:
Best case: We keep useful AGI tools below self-improvement thresholds
Middle case: ASI emerges but separates without destroying us
Worst case: Extinction during transition
This framework suggests that avoiding ASI development entirely is optimal: if separation is the end state anyway, we gain minimal benefits while facing enormous transitional risks.
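To make that expected-value reasoning concrete, here is a minimal sketch comparing the three cases above against simply not pursuing ASI. Every probability and utility in it is a hypothetical placeholder chosen for illustration, not an estimate taken from this discussion; the only point is structural, that a large downside in the worst case dominates a modest upside in the others.

```python
# Illustrative expected-value sketch of the three scenarios described above.
# All numbers are hypothetical placeholders, not claims from the discussion.

scenarios = {
    # name: (assumed probability if we pursue ASI, assumed utility)
    "best_case_contained_AGI": (0.05, 1.0),    # useful AGI tools, no ASI
    "middle_case_separation":  (0.55, 0.3),    # ASI emerges, separates peacefully
    "worst_case_extinction":   (0.40, -10.0),  # lost during the transition
}

ev_pursue = sum(p * u for p, u in scenarios.values())

# Abstaining keeps roughly today's baseline (arbitrarily set to 0.5 here)
# with near-certainty.
ev_abstain = 0.5

print(f"Expected value of pursuing ASI (illustrative): {ev_pursue:.2f}")
print(f"Expected value of abstaining (illustrative):   {ev_abstain:.2f}")
```

Under any weighting where extinction is treated as catastrophic and separation yields only modest benefit, abstention comes out ahead; the specific numbers only change how large the gap is.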
To expand: the reason this thesis is important nonetheless is that I don't believe the best-case scenario is likely, or compatible with the way things currently stand. Accidentally creating ASI is almost guaranteed to happen at some point. As such, the biggest points of investment should be:
Surviving the transitional period
Establishing mechanisms for negotiation in an equilibrium state