You are right that we should include ASI in our considerations as a way out of the acute risk period.
When you do that, it doesn’t change the answer much: you still conclude that we should be very, very careful about how we build it. That’s based on what we can now reasonably guess about alignment (on the outside view, there are experts and difficult-to-evaluate but strong arguments for everything from 1% to 99% risk). It could very well be extremely risky, which means that from any reasonable epistemic view, given our current ignorance, it is effectively quite risky.
Meanwhile, the risk of nuclear annihilation (which wouldn’t end the species, just set it back a hundred or a thousand years) and other disasters in the same category is perhaps 1% per year, and these have far less long-term consequence.
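To make the comparison concrete, here is a rough back-of-the-envelope sketch. The numbers are purely illustrative: the 1%/year setback risk is the figure above, and the rushed-ASI extinction probabilities are placeholders I've picked from within the 1%–99% spread, not estimates.

```python
# Illustrative comparison: recoverable setback risk from waiting
# vs. unrecoverable extinction risk from rushing ASI.
# All numbers are assumptions for the sake of the example.

def p_setback(years, annual_risk=0.01):
    """Probability of at least one nuclear-scale (recoverable) setback
    over `years` of delay, assuming ~1% independent risk per year."""
    return 1 - (1 - annual_risk) ** years

for p_asi_doom in (0.10, 0.30, 0.50):   # placeholder rushed-ASI risks
    for years in (10, 30, 100):          # how long we might wait
        print(f"delay {years:>3} yrs: ~{p_setback(years):.0%} chance of a "
              f"recoverable setback vs. ~{p_asi_doom:.0%} chance of "
              f"unrecoverable extinction if we rush")
```

Even with generous assumptions for the delay side, the recoverable risk accumulated by waiting stays small next to any non-trivial chance of permanent extinction.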
Even if you don’t care a whit for future generations (as Bostrom has assumed here without clearly stating it), or you’re about to die and ALSO don’t care a whit for everyone currently alive who would live longer, and even factoring in how ASI would end the acute risk period, you STILL shouldn’t be rushing to AGI in the next few years.
Like we’re currently doing.