Is it possible they already are? I could certainly see AI risks being part of the risk associated with both nuclear and bio threats.
I’m not sure (others here with direct exposure can answer better) that funding is a limiting factor at this point. If it isn’t, then the budget aspect doesn’t matter. What other constraints might DoD involvement help relax?
As I understand it, the recent US semiconductor policy updates—e.g., CHIPS Act, export controls—are unusually extreme, which does seem consistent with the hypothesis that they’re starting to take some AI-related threats more seriously. But my guess is that they’re mostly worried about more mundane/routine impacts on economic and military affairs, etc., rather than about this being the most significant event since the big bang; perhaps naively, I suspect we’d see more obvious signs if they were worried about the latter, a la physics departments clearing out during the Manhattan Project.
Timelines. USG could unilaterally slow AI progress. (Use your imagination.)