How likely do you think worse-than-extinction fates are?

The risks of unaligned AGI are usually couched in terms of existential risk, where the outcome is explicitly or implicitly human extinction. However, given the nature of what an AGI would be able to do, it seems as though there are plausible scenarios worse than human extinction. Just as on an individual level we can imagine fates we would prefer death to (e.g. prolonged terminal illness or imprisonment), we can extend this to humanity as a whole, ending up in hell-type fates. Such an outcome could arise from a technical error, from wilful design by the initial creators (say, religious extremists), or from some other unforeseen mishap.

How prominently do these concerns feature for anyone? How likely do you think worse-than-extinction scenarios are?