I was going to use the comparison “Humans born mentally handicapped to the point that their cognitive function is equivalent to a chimp’s.” (This avoids the potential issue of “babies grow up to be average humans.”)
It is not clear to me how that avoids the issue of including the future.
It avoids the issue of including the future of particular people. Some people care about that, others don’t, but it reduces the range of reasons you might object to the comparison.
From what I know, I personally weight chimps as maybe one-third as morally significant as humans. I’m sometimes willing to sacrifice humans to save other humans, and in those same situations I’d sacrifice a chimp to save about a third as many humans. (I’d also sacrifice a human to save 3x as many chimps.) This is mostly an intuitive belief. I can imagine myself changing the number to something as low as 1/10, maybe even as low as 1/100 (though I don’t expect to drop it that far).
It’s important to note, though, that I DON’T sacrifice humans on a 1-for-1 trade-off without their consent. I don’t want to live in a world where someone can sacrifice me without me having a say in the matter. There may be cases where I’d be willing to consent to being sacrificed, though I’m not sure I can identify them right now.
There are still circumstances where, while pissed, I’d grudgingly accept that the Mastermind doing the sacrificing was right to do so. (If they had to divert a train that was going to kill a lot of people, for example. Probably more than 5, though.) The number of lives saved that makes it worth it also has to account for how good the information is, and for the likelihood that the sacrificer isn’t running on damaged hardware.
So in theory I’m okay with sacrificing chimps to save arbitrarily large numbers of people, but because the chimps CAN’T consent, I’d only do it in situations where I’d also be willing to sacrifice somewhere between 1/3 and 1/10 as many non-consenting humans to accomplish the same thing.
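To make the consent-adjusted trade-off concrete, here is a toy sketch in Python. The 1/3 weight and the more-than-5 threshold come from the intuitions above; the function itself is purely an illustrative assumption, not a worked-out ethical rule:

```python
# Toy encoding of the consent-adjusted trade-off described above.
# CHIMP_WEIGHT and the >5 threshold come from the intuitions in this
# comment; everything else is an illustrative assumption.
CHIMP_WEIGHT = 1 / 3  # one chimp counted as ~1/3 of a human, morally

def acceptable_chimp_sacrifice(chimps_sacrificed: int, humans_saved: int) -> bool:
    """Accept sacrificing non-consenting chimps only where we would
    accept sacrificing the morally equivalent number of non-consenting
    humans (train example: clearly more than ~5 saved per 1 lost)."""
    equivalent_humans = chimps_sacrificed * CHIMP_WEIGHT
    return humans_saved > 5 * equivalent_humans

# e.g. sacrificing 3 chimps (~1 human-equivalent) is accepted only
# if it saves more than 5 people:
assert acceptable_chimp_sacrifice(3, 6)
assert not acceptable_chimp_sacrifice(3, 5)
```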
I read your post and tried to come up with an ‘exchange rate’ of my own, and it was much more difficult than I expected. I thought it would be along the lines of thousands or hundreds of thousands of chimps == 1 human, as I couldn’t conceive of letting one human die in exchange for any smaller number of chimps, but then I realized it would be much easier to think of dead chimps as an opportunity cost, and that I was just reacting with instinctive revulsion. (This assumes that dead chimps can’t be used to the same extent as live chimps to aid in medical research.)
So, what is the current value that we place on the life of a chimp?
If after m successful studies, each using n chimps, we can save l human lives, then (assuming the worst case, in which each study kills all n of its chimps):
mn × (the value of a chimp life in utilons) = l × (the value of a human life in utilons)
So: (the value of a human life)/(the value of a chimp life) = mn/l
This estimate will be higher than the real figure, since we don’t kill all the chimps used in a typical study. The difficulty is in quantifying the number of studies necessary to save a human life, or the number of lives saved by a particular discovery.
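As a quick illustration (not actual figures), here is a minimal Python sketch of that calculation; m, n, and l are invented numbers, chosen only so the result lands in the range discussed next:

```python
# Back-of-envelope version of the formula above. The values of m, n,
# and l are made up purely for illustration; estimating them is the
# hard part, as noted.
m = 50  # successful studies (hypothetical)
n = 10  # chimps per study, worst case: all killed (hypothetical)
l = 2   # human lives saved as a result (hypothetical)

# From mn * (chimp life) = l * (human life):
human_to_chimp_ratio = (m * n) / l
print(f"1 human life ≈ {human_to_chimp_ratio:.0f} chimp lives")  # 250
```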
However, thinking this way, I would place my ‘exchange rate’ on the order of 200-300 chimps to 1 human life; if necessary, we should let 1 human die so that 300 chimps might live, since their value as test subjects could then be used to save other humans.
I just don’t think chimps are intelligent enough for their lives to be significant on the same order of magnitude as a human’s; I think that 1/3 or even 1/10 of a human’s life is much too high a value.
Have you corrected for your estimate of p(chimps are uplifted in the next fifty years)?
Edit: Okay, if it makes a difference, I only realized the Planet of the Apes reference after I posted; I was making a serious point about the difference between human toddlers and chimps as it relates to the possibility of future personhood.
I hadn’t considered the possibility that chimps could or would be uplifted in the near future (fifty years, or a mean chimp lifetime, is a good rule of thumb); I think it’s entirely possible the technology will be there, but I don’t understand the motivation for wanting to uplift chimps. I guess the reasoning is that more sapient beings == more interesting conversations, more math proofs, more works of art, so more Fun, but I’m not sure we would want to uplift chimps even if we had the technology to do so.
If we had the technology to uplift a species, I think it’s likely we would also have the technology for FAI or uploaded human brains, which would be a more efficient way to get more sapient beings to talk with. Is it immoral to leave other species the way they are if transhumanism or FAI takes off?
This seems strange to me. Can you expand on your reasoning? Uplifting seems to me to be potentially a lot simpler. The tech level needed to identify the genes most responsible for human intelligence is not that much beyond our current one. And the example species you’ve used, chimps, are close enough to humans that for at least some of those genes, simply inserting them into the chimp genome would likely substantially increase their intelligence.
Uplifting seems orders of magnitude easier than uploading at least.
I’ll concede that you are probably right about uplifting being easier.
This was my reasoning:
Properly identifying which gene encodes for what, and usefully altering genes to express a phenotype as complex as human-level intelligence, would require (in any reasonable amount of time) at least a narrow AI to process and refine the huge amount of data in the half-chromosome or so that separates us from chimps. Chimps are close to humans, yes, but altering their DNA to uplift them seems to me to be the type of problem that would either take years of Manhattan-Project-level dedication with the technology we have right now, or some sort of AI to do the heavy lifting for us.
I think I’m way out of my depth here, though, as I don’t know enough about genetic engineering or AI research to know with confidence which would be easier.