Despite a limited number of jobs, humans remain critical to the economy as consumers. If we don’t keep a large consumer base, the entire economic system collapses and we no longer have the money to fund AI.
An extremely common fallacy, but one that is rather easily seen to be wrong! Whether the rich keep their money or the poor get part of it: anyone who wants to earn money/profits wants it only because they can use it one way or another. Therefore, the real economy does not collapse just because we don't redistribute. It just produces different things: the airplanes, palaces, rockets, skiing domes in the desert, or what have you that the rich prefer, instead of the cars, houses, booze, or whatever the poor would otherwise demand [adjust all examples to your liking to describe what the rich & poor would have a taste for in the AI future]. Even if you rebut that the rich 'don't consume, they just save', they will greedily save by investing, which also recycles their revenues into the economy.[1]
Trivially, the rich then also do have the money to fund the AI, even if we don't redistribute.
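The point above can be sketched with a toy illustration (all numbers hypothetical, not a real model): as long as every dollar of income is either consumed or invested, total spending is the same whether income is concentrated or redistributed; only the composition of output changes.

```python
# Toy sketch: total demand is unchanged by (non-)redistribution; only the
# mix of goods produced differs. Assumes savings are invested, i.e. spent
# on capital goods, so nothing "leaks out" of the economy.

def spending(income, propensity_to_consume):
    """Split an income stream into consumption and investment."""
    consumption = income * propensity_to_consume
    investment = income - consumption  # savings recycled as investment spending
    return consumption, investment

total_income = 100.0

# Scenario A: no redistribution; the rich hold the income and consume little.
rich_c, rich_i = spending(total_income, propensity_to_consume=0.2)

# Scenario B: redistribution; the poor hold the income and consume most of it.
poor_c, poor_i = spending(total_income, propensity_to_consume=0.9)

# Total demand (consumption + investment) is identical in both scenarios...
assert rich_c + rich_i == poor_c + poor_i == total_income
# ...but the mix of output differs: more capital goods and luxury consumption
# in A, more mass consumption goods in B.
print(f"A: consumption={rich_c}, investment={rich_i}")  # A: consumption=20.0, investment=80.0
print(f"B: consumption={poor_c}, investment={poor_i}")  # B: consumption=90.0, investment=10.0
```

The propensity-to-consume figures are made up; the point is only that the total is conserved either way.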
[Edit: the remainder can be ignored; it just describes a fear of mine, which is just that: a fear (with many reasons why its scenario may eventually not play out that way). It is related to the above but is not meant as a substantive claim, and it does not affect my actual claim above about the OP making a logical econ mistake.
Because ultimately, then, rather obviously no one needs us 'to consume as otherwise the economy collapses', I fear something in the direction of a triple whammy of: (i) half-impoverished, gullible people, (ii) flooded with AI-perfected, controlled social media and fake stories about whatever fake thing would supposedly be the reason for their increasing misery, and (iii) international competition in a low-marginal-cost world with mobile productive resources (meaning strong redistribution on a national level is actually not trivial). These conspire to undermine the natural solution of a generous UBI. So I fear a small ruling elite undermining the prospects for large material gains for the masses, though who knows, maybe we do keep enough mental and physical power to actually 'demand our more-or-less fair share' as a population. What makes me pessimistic is that already today a relatively small Western elite profits from worldwide resources while a large share of the population does not benefit commensurately, and clear autocratic, populist tendencies are already being supported by social/general media even in advanced countries.]
To preempt a potential confusion: I do not say that printing money and handing it to the poor would not boost an economy. That can work, at least in the short run, as it's expansionary fiscal/monetary policy. But this is a very different mechanism from directly transferring from rich to poor.
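The distinction can be made concrete with a toy sketch (all numbers hypothetical): a rich-to-poor transfer reshuffles a fixed total income, while printing money adds new nominal demand on top of it.

```python
# Toy sketch: transfer vs. money-printing as mechanisms for boosting demand.
# All figures are hypothetical illustrations.

rich_income, poor_income = 80.0, 20.0
total_before = rich_income + poor_income

# Mechanism 1: a pure transfer. Total demand is unchanged; only its
# composition shifts (toward whatever the poor buy).
transfer = 30.0
total_after_transfer = (rich_income - transfer) + (poor_income + transfer)
assert total_after_transfer == total_before  # 100.0 either way

# Mechanism 2: printing money and handing it to the poor. Total nominal
# demand rises -- this is the expansionary effect, absent in a transfer.
printed = 30.0
total_after_printing = rich_income + (poor_income + printed)
assert total_after_printing == total_before + printed  # 130.0 > 100.0
```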
One of the assumptions I'm making is that if AI dispossesses billions of people, that's billions of people who can rebel by attacking automation infrastructure. There might be a way to pull off dispossession gently enough that by the time anyone thinks to rebel it's already too late, but I expect less well-coordinated action, and instead sudden shocks that will have to be responded to. The only way to prevent violence that threatens the wealth of capital owners will be to find a way to placate the mass of would-be rebels (since something like killing everyone who doesn't have a job or own capital is, and will remain, morally reprehensible, and so not a real option), and I expect UBI to be the solution.
@FlorianH I see you reacted that you think I missed your point, but I’m not so sure I did. You seem to be making an argument that an economy can still function even if some actors leave that economy so long as some actors remain, which is of course true, but my broader point is about sustaining a level of consumption necessary for growth, and a fully automated economy could quickly reach the limits of its capacity to produce (and the wealth of the remaining consumers) if there are very few consumers. I expect to need a large base of consumers for there to be sufficient growth to justify the high costs of accelerating automation.
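The saturation worry in the paragraph above can be sketched as a toy model (all numbers hypothetical): revenue from expanding automated capacity stops growing once capacity exceeds what the remaining consumer base can afford, which is the point at which further automation investment stops paying off.

```python
# Toy sketch: returns to added capacity saturate at the demand ceiling
# set by the consumer base. All figures are hypothetical.

def revenue(capacity, num_consumers, spending_per_consumer):
    """Revenue is capped by total consumer demand: you can't sell more
    than the remaining consumers can afford to buy."""
    total_demand = num_consumers * spending_per_consumer
    return min(capacity, total_demand)

# Large consumer base: doubling capacity doubles revenue, so costly
# acceleration of automation is justified by growth.
assert revenue(100, num_consumers=1000, spending_per_consumer=1.0) == 100
assert revenue(200, num_consumers=1000, spending_per_consumer=1.0) == 200

# Tiny consumer base: extra capacity earns nothing past the ceiling,
# however wealthy each remaining consumer is per head.
assert revenue(100, num_consumers=10, spending_per_consumer=5.0) == 50
assert revenue(200, num_consumers=10, spending_per_consumer=5.0) == 50
```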