The statement does not use the term “existential risk”; it refers to “the risk of extinction from AI”. But extinction risk is, by definition, an existential risk.
“An existential risk is any risk that has the potential to eliminate all of humanity or, at the very least, kill large swaths of the global population.”—FLI