I’ve interpreted doom as a very bad thing happening to humanity. On that reading it could even mean a global catastrophe, which is sometimes defined as a 10% population loss. If that were the case, and if one thought that any disempowerment would result in extinction, you could actually have a higher P(doom from AI) than P(X catastrophe from AI), the opposite of the OP.