It is very unlikely that AI causes an existential catastrophe (in Bostrom's or Ord's sense) without also resulting in human extinction. (That is, non-extinction AI x-risk scenarios are unlikely.)