[Question] Best resource to go from “typical smart tech-savvy person” to “person who gets AGI risk urgency”?


Most tech-savvy people today see AI as a cool tech trend like the internet, smartphones, crypto, or virtual reality. The claim "AGI is pretty likely to kill you and your loved ones in your lifetime" has never been a belief they take seriously, or one they think many other people take seriously. They don't intuitively perceive AGI as a risk relevant within their own lifetime, only as a low-probability long-term risk like an asteroid impact.

The Needed Resource

Something on the internet they can read or watch that guides such a person to the point of realizing that AI existential risk may be real and urgent, at least as much so as climate change; that they might be ringing in the new decade as a nonhuman configuration of atoms.

Benefit of Having This

The benefit is that a significant fraction of the people who get AGI risk urgency will be motivated to do something helpful for the cause, a much larger fraction than among people who don't get it.

Some Candidate Resources

Two off the top of my head:

[WaitButWhy] The AI Revolution: Our Immortality or Extinction

[LessWrong] AGI Ruin: A List of Lethalities

The WaitButWhy post does a great job of being generally accessible, while List of Lethalities packs in more of the arguments for why the danger level seems high. There seems to be a gap for a resource better suited to making a normal person realize this is real and urgent, despite the social Overton window not yet giving them that message.

If there's no good resource, I might be up for working on something myself. Maybe a good format would be something like a technical documentation site: a linear walkthrough that also lets you skip around to the sections you're interested in.
