Intelligence Explosion Microeconomics discusses the kinds of open questions we’d need to answer in order to know whether or not there will be such a band.
Section 2.3 (and its subsections) of Responses to Catastrophic AGI Risk also discusses three different types of FOOM that might be possible: hardware overhang, speed explosion, and an intelligence explosion. Your argument should probably address all three.
Thanks. I’ll look over these.
Edit: It looks like Section 4 (AGI Containment) covers many of my thoughts and comes to a pretty similar conclusion: external constraints on AGI are an imperfect plan, but potentially valuable and complementary to other safety approaches.