If quantum immortality is true, you will survive only in the worlds where the AI does not kill everyone immediately. This could happen in several cases:
AI is friendly.
AI keeps humans for some instrumental purpose.
AI is hostile and intends to torture you.
AI doesn’t care about humans and isn’t interested in their atoms; humans may die later from environmental degradation.
You are the sole survivor, whom the AI is unable to kill or has failed to kill.
Which of these is most probable?