I think the harder a theoretical doom scenario is to execute, the easier the system is to control, at least until alignment research catches up. This matters because obsessing over unlikely scenarios that make the problem seem harder than it is can rule out potential solutions.