I think this would be the correct thing to do for an appropriately safety-minded AI company. It would send a strong signal to governments to wake up to AI x-risk, and it would put pressure on other companies to take safety more seriously.
Given (1) uncertainty about timelines/takeoff speeds and (2) uncertainty about how long it would take for the good effects to pay off, I think a safety-minded AI company should shut down immediately. (I think a year ago would have also been a good time.)
Another thing a company could do is stop working on capabilities entirely and dedicate 100% of its resources to alignment research and other AI safety activities until it runs out of money. For a pure for-profit, this would violate fiduciary duty, but I think (IANAL) OpenAI and Anthropic would be on solid legal ground because they're not pure for-profits. I also think there's a (correct) argument to be made that decreasing your shareholders' risk of dying is consistent with fiduciary duty (although I kind of doubt courts would go for that).