Does anyone else feel like the export-control discussion at the moment is a bit hyperbolic? I don’t expect human-level AGI to be achieved for about 5 years, and by then the current generation of GPUs will be obsolete. It certainly doesn’t give me confidence in this administration’s ability to do things, but selling China advanced chips now is probably fine if we stop in like 3 years. Which hopefully we will.
Great work as always. I’m not sure I agree that we should be focusing on flourishing, conditional on survival. I think a bigger concern would be risks of astronomical suffering, which seem almost like the default outcome: e.g. digital minds, wild animals in space colonization, and unknown unknowns. It’s possible that the interventions would overlap, but I am skeptical.
I also don’t love the citations for a low p(doom). Toby Ord’s guess was from 2020, the superforecaster survey from 2022, and prediction markets aren’t really optimized for this sort of question. Something like Eli Lifland’s estimate or the AI Impacts surveys would be a better jumping-off point.
Am I stupid, or why does this cut off at the end? Assuming that’s a conscious choice.