Probability that humanity has somehow irreversibly messed up our future within 10 years of building powerful AI: 46%
What’s a short phrase that captures this? I’ve been using “AI-related x-risk” or just “AI x-risk” or “AI risk” but it sounds like you might disagree with using some or all of these phrases for this purpose (since most of this 46% isn’t “from AI” in your perspective)?
(BTW it seems that we’re not as far apart as I thought. My own number for this is 80-90% and I thought yours was closer to 20% than 50%.)
Maybe x-risk driven by explosive (technological) growth?
Edit: though some people think the AI point of no return might happen before the growth explosion.
AI-induced problems/risks