some people (not on this website) seem to use “existential risk” in ways that imply they don’t think it means “extinction risk”. perhaps literally saying “extinction risk” would be less ambiguous.
A salient and arguably valid non-extinction meaning of “existential threat” is “the firm will go bust”. So many companies are currently considering whether failing to use AI could be existential. Economic and national-security implications make this meaning applicable to countries too, so governments are considering whether failing to do well in the AI race could be existential.
If you’re playing an “inside game” strategy (doing alignment research on behalf of top AI labs and lobbying the US government on their behalf), jargon might be good. If you’re playing an “outside game” strategy (mass protest, election campaigns), jargon might be bad.
ambiguous jargon doesn’t seem beneficial
I agree. I think “existential” basically isn’t enough common parlance for most people to not just round it off to “big”, in the same way that “literally” becomes “very”.
Yup, this is similar to what I’ve heard ControlAI found in their briefings with policymakers, though I’m not sure it made its way into the writeup. Many people don’t know what the word “existential” means! https://open.substack.com/pub/leticiagarciamartinez/p/what-we-learned-from-briefing-70
I promoted that for a long time (example from 2008). But I guess Nick Bostrom had more influence :-)
I’ve always taken “existential” risk to cover more than just extinction. Bostrom defines it as something like the permanent destruction of humanity’s potential. So a scenario where a population of living humans survives, but only in a zoo, is “existential” but not “extinction”.