I agree that the economic principles conflict; you are correct that my question was about the human labor part. I don’t even require that they be substitutes; at the level of abstraction we are working in, it seems perfectly plausible that some new niches will open up. Anything would qualify, even if it is some new-fangled job title like ‘adaptation engineer’ or something that just preps new types of environments for teleoperation before moving onto the next environment, like some kind of meta railroad gang. In this case the value of human labor might stay sustainably high in terms of total value, but the amplitude of the value would sort of slide into the few AI-relevant niches.
I think this cashes out as Principle A winning out and Principle B winning out looking the same for most people.
But I don’t think that lesson generalizes because of an argument Eliezer makes all the time: the technologies created by evolution (e.g. animals) can do things that current human technology cannot. E.g. humans cannot currently make a self-contained “artificial cow” that can autonomously turn grass and water into more copies of itself, while also creating milk, etc. But that’s an artifact of our current immature technology situation, and we shouldn’t expect it to last into the superintelligence era, with its more advanced future technology.
Separately, I don’t think “preps new types of environments for teleoperation” is a good example of a future human job. Teleoperated robots can string ethernet cables and install wifi and whatever just like humans can. By analogy, humans have never needed intelligent extraterrestrials to come along and “prep new types of environments for human operation”. Rather, we humans have always been able to bootstrap our way into new environments. Why don’t you expect AGIs to be able to do that too?
(I understand that it’s possible to believe that there will be economic niches for humans, because of more abstract reasons, even if we can’t name even a single plausible example right now. But still, not being able to come up with any plausible examples is surely a bad sign.)
I looked it up, evidently mules still have at least one tiny economic niche in the developed world. Go figure :)
I do, I just expect it to take a few iterations. I don’t expect any kind of stable niche for humans after AGI appears.