I don’t understand your objection. right now, the cost of replacing a given human with AI is greater than the cost of the human (because compute is very expensive, the AIs are not very good, etc.). over time, AI gets cheaper and cheaper, until at some point it is precisely as expensive as the human. one day thereafter, AI will be very slightly cheaper than the human, and you would prefer to pay for the AI compute instead of the human salary. at that moment, firms will have an economic incentive to fire all the humans and replace them with AIs. because the AIs still cost almost exactly as much as humans at this moment, it won’t be economical to have substantially more AIs than you had humans the day before; if it were, we would have hired more humans in the first place. there must be diminishing returns to the quantity of humans employed, and the previous equilibrium is still very close to the new one. but the amount of new value created for the world by this switch is very small: only the delta between what the humans used to cost and what the AIs now cost.
Your economics are wrong for a few reasons. Let’s grant the hypothetical where all humans supply homogeneous labor at a uniform wage.
If AI is slightly cheaper than humans, what happens is that wages fall slightly. At the new, lower wages, there is more demand for labor (and more humans drop out of the labor force). At the same time, capital costs are bid up slightly. Eventually the price of AI and the price of human labor are equal, and the quantity demanded equals the quantity supplied.
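The adjustment described above can be made concrete with a toy linear supply-and-demand model. All the numbers and functional forms here are invented purely for illustration; the point is just the mechanism: AI entering as a perfectly elastic supplier at a price just below the going wage pulls the wage down to the AI price, marginal humans drop out, and AI fills the gap rather than replacing everyone at once.

```python
# Toy model: AI enters at a price just below the market-clearing wage.
# Linear demand and supply curves; all parameters are made up.

def labor_demand(w):
    # Quantity of labor firms want at wage w (downward sloping).
    return 100 - 2 * w

def human_supply(w):
    # Quantity of labor humans offer at wage w (upward sloping).
    return 20 + 2 * w

# Pre-AI equilibrium: 100 - 2w = 20 + 2w  ->  w0 = 20, q0 = 60 (all human).
w0 = 20
q0 = labor_demand(w0)

# AI arrives as a perfectly elastic supplier at a price slightly below w0.
p_ai = 19.5
wage = p_ai                   # human wages are competed down to the AI price
q_total = labor_demand(wage)  # 61 units demanded at the lower price
q_human = human_supply(wage)  # 59 humans still willing to work at this wage
q_ai = q_total - q_human      # 2 units supplied by AI

print(wage, q_total, q_human, q_ai)
```

Note that total employment of labor (human plus AI) rises slightly, while human employment falls slightly: the lower price induces both more quantity demanded and some human exit, exactly the two margins described above.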
At the same time, you are increasing demand for labor to build the AI (right now, labor is ultimately the main input to building all the stuff that goes into datacenters). If the social value of the AI is near zero, then the net increase in demand is almost the same as the net increase in supply. Lowering wages and increasing capital costs doesn’t offset the benefits of extra productive capacity; it just shifts value from laborers to capitalists.
The real fiscal issue in this scenario is that you are shifting output from labor to capital, and the tax rate on capital is lower than the tax rate on labor. (Moreover, as you automate the economy there are further corporate reorganizations that would drive effective tax rates well below the on-paper capital gains rate.) You’re doing that at the same time that you are potentially increasing spending, which is tough unless you are willing to adjust the tax code.
I’m inclined to agree with other commenters though that none of this seems like the most important issue. The fiscal issues can be overcome if the state cares, and my best guess is that growth will accelerate enough that it would be OK even if there was no political change.
People should have much bigger concerns about being completely materially disempowered: (i) the state may not continue to support them, either because they are politically disempowered or because the state itself is disempowered, and (ii) even if they are able to survive they will have no say over what the world looks like and that sucks in its own way.
My idea was, maybe the AI company is willing to sell you 1 unit of AI labor at a human-competitive price, but if you order 1000 units they’ll ask for a higher price per unit, because they need to build more datacenters or something. In this case the replacement of humans will be gradual even if all humans are equally productive. Another possibility is that humans aren’t all equally productive, so AI will first get good enough to replace the worst worker, then the second worst, and so on. For these two reasons, I think it’s possible that by the time lots of people have been replaced, the difference in productivity between AI and the average person replaced so far won’t be epsilon. It won’t be the full salary either, but maybe something substantial. Anyway, that was it.
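The two effects above can be combined into one stylized sketch (every number here is invented): an upward-sloping AI supply curve, where each additional unit of AI labor costs more to provision, meets a set of workers with unequal cost per unit of output, replaced worst (most expensive) first. Replacement stops when the next AI unit costs more than the next worker, and the surplus per replacement accumulated up to that point is substantial rather than epsilon.

```python
# Toy sketch: rising AI marginal cost + heterogeneous workers.
# wages[i] = cost per unit of output of worker i, worst worker first.
wages = list(range(30, 9, -1))  # 21 workers, costs 30 down to 10

def ai_marginal_cost(n):
    # Cost of the (n+1)-th unit of AI labor; rises as you scale up
    # (e.g. because more datacenters must be built). Made-up curve.
    return 8 + 0.7 * n

surpluses = []
for n, w in enumerate(wages):
    mc = ai_marginal_cost(n)
    if mc >= w:              # next AI unit costs more than the next worker
        break
    surpluses.append(w - mc)  # savings from this particular replacement

print(len(surpluses), sum(surpluses), sum(surpluses) / len(surpluses))
```

In this made-up example only 13 of the 21 workers end up replaced, and the average saving per replaced worker is around 11.8, i.e. a large fraction of a wage rather than a vanishing sliver. That is the sense in which the rising supply curve and worker heterogeneity together make the aggregate value of the switch non-negligible.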