This sort of utilitarian calculation should be done with something like QALYs, not lives. If the best charities extend life at $150 per QALY, and a $20,000 neuro-suspension extends life by a risk-adjusted 200 QALYs, then cryonics works out to $100 per QALY, and purchasing it for yourself would be altruistically utilitarian.
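The comparison above can be sketched in a few lines. All three numbers are the illustrative figures from this comment, not real estimates; the risk-adjusted QALY count in particular is a placeholder (see the note below about not anchoring on it).

```python
# Rough cost-per-QALY comparison, using the illustrative numbers from the
# comment above (none of these are real-world estimates).

BEST_CHARITY_COST_PER_QALY = 150   # $ per QALY for the best charities
SUSPENSION_COST = 20_000           # $ for a neuro-suspension
RISK_ADJUSTED_QALYS = 200          # risk-adjusted QALYs gained, if any

cryonics_cost_per_qaly = SUSPENSION_COST / RISK_ADJUSTED_QALYS
print(f"Cryonics: ${cryonics_cost_per_qaly:.0f} per QALY")
print("Cheaper than the charity benchmark:",
      cryonics_cost_per_qaly < BEST_CHARITY_COST_PER_QALY)
```

The conclusion flips easily: at these numbers any risk-adjusted estimate below ~133 QALYs makes the charity donation the better buy.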
True, but that’s much harder to estimate (real-world QALY data is hard to come by) and involves more uncertainty (how many QALYs should you expect after revival?), and I didn’t want to do that much work; I just wanted a quick estimate.
However, I’m guessing someone else has done this properly at some point?
Note: I have not, so do not use my 200 QALYs as an anchor.
These calculations get really messy, because a future civilization reviving you as an upload is unlikely to have its population limited by the supply of frozen people to scan. It will probably run as many people as it has resources (or work) for, and if it decides to run you, that is instead of running someone else. On that view, there are probably no altruistic QALYs in preserving someone for this future.
This reply really made me think, and prompted me to ask this question.