I expect that human extinction would follow from comprehensive disempowerment within decades.
Why would it take decades? (In contrast to scenarios where AI builds nanotech and quickly disassembles us for spare atoms.) Are you imagining a world where AI-powered corporations, governments, &c. are still mostly behaving as designed, but we have no way to change course when it turns out that industrial byproducts are slowly poisoning us, or …?
I don’t feel I can rule out slow/weird scenarios like those you describe, or where extinction is fast but comes considerably after disempowerment, or where industrial civilization is destroyed quickly but it’s not worth mopping up immediately—“what happens after AI takes over” is by nature extremely difficult to predict. Very fast disasters are also plausible, of course.