where we will have radically different medical capabilities if AGI happens in the next two decades?
Well, on my view, if actual AGI (general intelligence that’s smarter than humans in every way including deep things like scientific and technological creativity) happens, we’re quite likely to all die very soon after. But yeah, if you don’t think that, then on your view AGI would plausibly obsolete any current scientific work including reprogenetics, IDK.
Another thing to point out is that, if this is a motive for making AGI, then reprogenetics could (legitimately!) demotivate AGI capabilities research, which would decrease X-risk.
It means not being very confident that AGI happens within two decades, yeah. Cf. https://www.lesswrong.com/posts/sTDfraZab47KiRMmT/views-on-when-agi-comes-and-on-strategy-to-reduce and https://www.lesswrong.com/posts/5tqFT3bcTekvico4d/do-confident-short-timelines-make-sense
Yes.
Someone could do a research project to guesstimate the impact more precisely. As one touchpoint, here’s 2021 US causes of death, per the CDC:
(From https://wisqars.cdc.gov/pdfs/leading-causes-of-death-by-age-group_2021_508.pdf )
Total deaths of young people in the US are small, in relative terms, so there’s not much room for impact. There would still be some impact; we can’t tell from this graph of course, but many of the diseases listed could probably be quite substantially derisked (cardio, neoplasms, respiratory).
This is only deaths, so there’s more impact if you include non-lethal cases of illness. IDK how much of this you can impact with reprogenetics, especially since uptake would take a long time.
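To make that “research project” concrete, here’s a minimal sketch of the kind of back-of-envelope estimate I have in mind. All the death counts and derisking fractions below are made-up placeholders (not the CDC’s numbers, and not effect sizes I’d defend); the point is just the shape of the calculation, to be filled in from the linked PDF and from actual genetic-risk estimates.

```python
# Back-of-envelope guesstimate: how many US deaths per year among younger age
# groups might reprogenetics avert? All numbers are PLACEHOLDERS, not values
# from the CDC chart -- replace them with figures from the linked WISQARS PDF.

# Hypothetical annual US deaths by cause among younger age groups (illustrative only):
deaths_young = {
    "unintentional_injury": 60_000,
    "suicide": 25_000,
    "homicide": 15_000,
    "heart_disease": 15_000,
    "malignant_neoplasms": 15_000,
    "chronic_respiratory": 3_000,
}

# Guessed fraction of each cause that could plausibly be derisked genetically
# (pure assumptions, just to show how the estimate would be assembled):
derisk_fraction = {
    "unintentional_injury": 0.0,
    "suicide": 0.1,
    "homicide": 0.0,
    "heart_disease": 0.6,
    "malignant_neoplasms": 0.4,
    "chronic_respiratory": 0.5,
}

averted = sum(n * derisk_fraction[cause] for cause, n in deaths_young.items())
total = sum(deaths_young.values())
print(f"Averted: {averted:,.0f} of {total:,} deaths/year ({averted / total:.0%})")
```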
You don’t need to be smarter in every possible way to get a radical increase in the speed of solving illnesses.
I think part of the motive for making AGI is to solve all illnesses for everyone, not just for people who aren’t yet born.
You need the scientific and technological creativity part, and the rest would probably flow, is my guess.
What I mean is that giving humanity more brainpower also gets these benefits; see https://tsvibt.blogspot.com/2025/11/hia-and-x-risk-part-1-why-it-helps.html. It may take longer than AGI, but also it doesn’t pose a (huge) risk of killing everyone.