I don’t think it’s clear on longtermist grounds. Some possibilities:
- If you think that the amount of resources used on mundane human welfare post-singularity is constant, then adding the Zambian child to the population leads to a slight decrease in the lifespan of the rest of the population, so it’s zero-sum.
- If you think that the amount of resources scales with population, then the child takes resources from the pool that would otherwise be spent on things other than mundane human welfare, so it might reduce the amount of hedonium (if you care about that). (A toy model after this list makes these first two assumptions concrete.)
- If you think that the lightcone will basically be spent on the CEV of the humans that exist around the singularity, you might worry that the marginal child’s vote will make the CEV worse.
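Here is a minimal sketch of those first two assumptions. All numbers (the total resource pool, the welfare budget, the per-person cost) are purely illustrative stand-ins, not anything we actually know about post-singularity allocation:

```python
# Toy model of post-singularity resource allocation.
# All numbers are purely illustrative.

TOTAL_RESOURCES = 1_000_000  # everything in the lightcone (arbitrary units)

def per_person_share(population, welfare_budget=10_000):
    """Assumption 1: a fixed budget for mundane human welfare.
    Each extra person shrinks everyone else's share: zero-sum."""
    return welfare_budget / population

def non_welfare_remainder(population, per_person_cost=100):
    """Assumption 2: the welfare budget scales with population.
    Each extra person leaves less for everything else (e.g. hedonium)."""
    return TOTAL_RESOURCES - population * per_person_cost

for n in (1_000, 1_001):  # before and after saving one marginal child
    print(f"n={n}: welfare share per person = {per_person_share(n):.3f}, "
          f"resources left for hedonium = {non_welfare_remainder(n):,}")
```

Under the first assumption total welfare is unchanged by the extra person; under the second, each saved life costs exactly its upkeep in forgone hedonium.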
(I’m not sure what my bottom-line view is.)
In general, I worry that we’re basically clueless about the long-run consequences of most neartermist interventions.
Thanks for these considerations; I’ll ponder them more later.
Here are my immediate thoughts:
> If you think that the amount of resources used on mundane human welfare post-singularity is constant, then adding the Zambian child to the population leads to a slight decrease in the lifespan of the rest of the population, so it’s zero-sum.
Hmm, this is only true under an impersonal ethics, on which the sole moral consideration is maximising pleasurable person-moments. On such a view, you are morally indifferent to killing 1000 infants and replacing them with new people of the same welfare. But that violates common-sense morality, and I think you should have some credence (under moral uncertainty) that it is bad.
> If you think that the lightcone will basically be spent on the CEV of the humans that exist around the singularity, you might worry that the marginal child’s vote will make the CEV worse.
Hmm, this doesn’t seem clear-cut, and certainly not enough to justify deviating so strongly from common-sense morality; just naively, it sounds crazy to me:
- This consideration assumes that the child you save from malaria cares less about hedonium (or whatever weird thing EAs care about) than the average person does. But you might naively expect them to care more about hedonium, because they actually owe their lives to EA whereas almost no one else does.
- This consideration assumes that the CEV is weighted equally across all humans rather than by wealth. If it’s weighted by wealth, then the GiveDirectly donation has the same impact on the CEV as the AMF donation. (See the sketch after this list.)
- This consideration implies that people are incentivised to kill as many other participants as possible just before the CEV procedure is run. But a CEV procedure that incentivised murder would be terrible, so we wouldn’t run it; we are more likely to run one that rewards people for saving the lives of its participants.
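To make the wealth-weighting point concrete, here is a toy sketch. Treating CEV as a simple weighted average of one-dimensional preferences is my own simplifying assumption, and every number is made up:

```python
# Toy CEV aggregation (all numbers made up for illustration).
# Each participant has a preference in [0, 1] for, say, the hedonium share
# of the lightcone; "CEV" here is just a weighted average of preferences,
# an assumption, not the actual procedure.

def cev(preferences, weights):
    return sum(p * w for p, w in zip(preferences, weights)) / sum(weights)

n = 1000                 # incumbents around the singularity
prefs = [0.5] * n        # incumbents' shared preference
child_pref = 0.9         # the marginal saved child's preference

# Equal weighting: every participant counts the same.
equal = cev(prefs + [child_pref], [1] * n + [1])

# Wealth weighting: votes proportional to wealth (incumbents far richer).
wealth = cev(prefs + [child_pref], [50_000] * n + [100])

print(f"equal-weighted CEV:  0.5 -> {equal:.7f}")   # noticeable shift
print(f"wealth-weighted CEV: 0.5 -> {wealth:.7f}")  # negligible shift
```

Under equal weighting the marginal participant shifts the aggregate by about 4e-4; under wealth weighting, by about 8e-7, i.e. essentially nothing.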