Plausibly high-value governance cause area: mandating human-maintainable infrastructure. That is, even if AI can design a more efficient energy grid or computer wafer, we only adopt such designs if we can keep operating them after pressing any relevant Big Red Buttons on AI.
We do not get this by default: for any x, there are plausibly more efficient versions of x that require continuous AI management than versions that do not.
(A stronger version of this would mandate that designs we accept be ones we can mechanistically understand, not just independently operate.)
This is in the long-term interests of everyone and could marshal the short-term job-protection interests of various lobbies.