I don’t feel I can rule out slow/weird scenarios like those you describe, or ones where extinction is fast but comes considerably after disempowerment, or where industrial civilization is destroyed quickly but it isn’t worth mopping up immediately: “what happens after AI takes over” is by nature extremely difficult to predict. Very fast disasters are also plausible, of course.