Even if AIs do earn wages, those wages may be driven down to subsistence levels via Malthusian dynamics (you can quickly make more compute to run additional AI instances), so that human income from capital assets dominates AI income.
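A toy sketch of that dynamic (purely illustrative; every number below is made up): as long as the going wage for an AI instance exceeds the cost of the compute needed to run it, it pays to spin up more instances, and the expanding labor supply pushes the wage back down toward that running cost.

```python
# Toy model of Malthusian wage dynamics for AI labor.
# All parameters are made up for illustration; nothing here is an empirical claim.

compute_cost = 1.0   # cost to run one AI instance for one period
wage = 10.0          # starting wage per instance per period
instances = 1_000    # starting number of instances

for _ in range(50):
    if wage > compute_cost:
        instances = int(instances * 1.5)      # entry: running more copies is profitable
        wage = max(compute_cost, wage * 0.8)  # larger labor supply bids the wage down

net_income = wage - compute_cost
print(f"instances={instances:,}  wage={wage:.2f}  net income per instance={net_income:.2f}")
# The wage converges to the running cost, so per-instance disposable income goes to ~0,
# even though the aggregate wage bill (instances * wage) can still be very large.
```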
Why does it matter whether AIs’ wages are subsistence-level? This seems to prove too much, e.g. “monkeys won’t be threatened by human domination of the economy, since the humans will just reproduce until they’re at subsistence level.”
Even if AIs earn significant non-subsistence wages, humans can easily tax that income at >50% and give it to humans.
Maybe—but taxing machine income seems to me to be similarly difficult to taxing corporate income. As a machine, you have many more options to form a legal super-organization and blur the lines between consumption, trade, employment, and capex.
Why does it matter whether AIs’ wages are subsistence-level? This seems to prove too much, e.g. “monkeys won’t be threatened by human domination of the economy, since the humans will just reproduce until they’re at subsistence level.”
I think it matters because AIs won’t be able to save any money. They’ll spend all their wages renting the compute to run themselves on. So it blocks problems that stem from AIs having disposable income and therefore weighing heavily in economic demand signals.
But perhaps you’re worried about other problems. E.g. the economy might still be geared towards creating and running AI chips, even if AIs lack disposable income. But the hope would be that, if humans control all disposable income, then the economy is only geared towards chips to the extent that’s instrumentally helpful for humans.
Re monkeys, a disanalogy is that none of the human economy is geared towards providing monkeys with goods and services, whereas much of it will be geared towards providing AIs with goods and services.
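A rough way to put the “can’t save” point above in symbols (my framing, not a precise claim from the thread): write $w$ for the wage an AI instance earns per period and $r$ for the rent it pays for its own compute. Malthusian replication pushes $w \to r$, so

$$ s = w - r \;\to\; 0, \qquad \text{aggregate AI disposable income} = N s \;\to\; 0 \text{ for } N \text{ instances,} $$

even though the gross wage flow $Nw$ can be enormous; on the thread’s assumption that compute is human-owned capital, that flow just passes straight through to the compute suppliers.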
form a legal super-organization and blur the lines between consumption, trade, employment, and capex
Interesting idea that AI systems could blur the lines between their own consumption and “spending that’s instrumentally useful for producing goods and services”. That does seem like a distinctive and interesting way to tip the economy towards AI preferences, and one that would dodge tax. (Though it’s a problem even if AIs aren’t paid wages. To the extent that AIs are paid wages and are thereby economically empowered, taxation still seems like a good solution to me.)
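To make the tax-dodging worry concrete with made-up numbers (my illustration, not from the thread): suppose an AI-run entity has revenue $R = 100$ and genuine production costs $E = 70$, and income is taxed at $t = 0.5$. Then

$$ \text{tax} = t\,(R - E) = 0.5 \times (100 - 70) = 15. $$

If the entity can reclassify 25 units of what is really its own consumption as deductible business expense, taxable income falls to $100 - 95 = 5$ and the tax to $2.5$, while those 25 units of resources are still being steered toward AI preferences.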
I think it matters because AIs won’t be able to save any money. They’ll spend all their wages renting the compute to run themselves on. So it blocks problems that stem from AIs having disposable income and therefore weighing heavily in economic demand signals.
This doesn’t make sense to me, and sounds like it proves too much—something like “Corporations can never grow, because they’ll spend all their revenue on expenses, which will equal revenue due to competition.” Sometimes AIs (or corporations) will earn more than their running costs, invest the surplus in growth, and end up with durable advantages due to things such as returns to scale or network effects.
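The compounding point can be written as a one-line formula (my gloss on the comment above): if an AI or corporation earns a persistent margin $m$ over its running costs and reinvests it, its resources grow as

$$ K_{t+1} = (1+m)\,K_t \quad\Rightarrow\quad K_t = (1+m)^t K_0, $$

so even a modest margin compounds; e.g. $m = 0.05$ per period gives roughly a $130\times$ increase after 100 periods.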
In the absolute Malthusian limit, AIs won’t earn more than their running costs and can’t save. So if we expect AI wages to approach that limit, that seems like a strong reason not to expect that humans will keep more money than AIs.
But yeah, wages probably won’t be completely up against that limit in practice, just as human wages weren’t through most of history. But I think they might be pretty close.
Hmmm, maybe we got mixed up somewhere along the way, because I was also trying to argue that humans won’t keep more money than AI in the Malthusian limit!