> Even though the number of product categories is relevant to reconstructing GDP from deconstructed GDP, the number of product categories available is not Quantity in this equation, it’s n.
> That’s why you’re not going to get “annual sales of Schmartphones would 60x over the time period, while maintaining constant GDP share” when the rest of the economy is growing at 0%. And why “The growth experienced by consumers however would be linear by any common sense understanding of the situation” is not true: the growth in the number of categories might be linear, but consumers would be buying exponentially more product value (quantity * price).
On product categories vs. quantity: I agree; I intended Schmartphones to be a new category in the post, apologies if that wasn’t clear[1].
> That’s why you’re not going to get “annual sales of Schmartphones would 60x over the time period, while maintaining constant GDP share” when the rest of the economy is growing at 0%
I believe this actually is possible, and works in my example, though I accept that it’s not very realistic and the path would look more like the “Trajectory of a newly invented good” graph.
The point is that the GDP shares of the other categories can change even while those categories grow at 0%, because measured growth is based only on the quantity of goods delivered. Going through the example again with more explicit numbers (and changing growth to 100% YoY to simplify it):
t = 0: There are n = 9 types of goods, 1000 units of each are sold, and they all have the same price of $1, so each gets a 1/9th GDP share. The new good has been invented but has a 0% GDP share so far.
t = 1: The new type of good sells 2 units, at $500 each, so it goes straight to GDP share parity with all the other goods, and each now has a 1/(9+1) = 1/10 GDP share. But each of the other categories still sells 1000 units, so they are all at 0% growth, and per the formula contribute nothing to GDP growth overall.
t = 2: The new type of good sells 4 units, at $250. It maintains its 1/10 GDP share, grows by 100%, and adds 10% to overall GDP growth. The other types of goods still sell 1000 units each and contribute nothing to GDP growth.
^Perhaps an important point glossed over in the post is that the price of the new good has to fall in inverse proportion to its quantity growth (here, quantity doubles while the price halves) for its GDP share to stay constant. And as mentioned briefly in the footnote, I think the simplest way to imagine this type of economy is that people work more hours to provide the new type of good, while otherwise changing nothing about what they do for the other goods.
Sidenote: This setup would keep nominal GDP constant, and the economy overall would be judged as being in steep deflation because you can buy more and more “product value” in the form of the new good, for the same nominal dollars.
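The example above is easy to check mechanically. Here is a minimal Python sketch of the toy economy (the `economy` and `real_growth` names are mine, and the share-weighted growth sum is my reading of “the formula” referenced above):

```python
# Toy economy from the example: 9 mature goods (1000 units at $1 each)
# plus one new good whose quantity doubles each year while its price
# halves, keeping every category's GDP share constant.

def economy(t):
    """(price, quantity) per category at year t >= 1."""
    goods = {f"good{i}": (1.0, 1000) for i in range(9)}    # the 9 old goods
    goods["new"] = (500 / 2 ** (t - 1), 2 * 2 ** (t - 1))  # price halves, quantity doubles
    return goods

def nominal(goods):
    return sum(p * q for p, q in goods.values())

def real_growth(prev, curr):
    """Share-weighted quantity growth: each category contributes
    (its GDP share last year) * (its quantity growth rate)."""
    total = nominal(prev)
    return sum((p * q / total) * (curr[k][1] / q - 1)
               for k, (p, q) in prev.items())

print(nominal(economy(1)), nominal(economy(2)))  # nominal GDP stays at $10,000
print(real_growth(economy(1), economy(2)))       # 0.1 -> 10% measured real growth
```

The old goods all contribute zero (their quantities never change); the entire 10% of measured real growth comes from the new good’s constant 1/10 share times its 100% quantity growth, while nominal GDP never moves.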
> the growth in the number of categories might be linear, but consumers would be buying exponentially more product value (quantity * price).
I think this gets to the crux of it, and is the main point I’m trying to make: Using either fixed base year accounting or chain weighting, the value of a new product essentially gets baked into GDP at the [price when it is introduced] x [quantity when it is mature]. If the price at the time of introduction is proportional to GDP at that time (say, people always spend 1% of their income on the new gizmo, or 1% of people have a ton of disposable income and are price insensitive), then a linear flow of new goods will translate into an exponential increase in GDP.
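That mechanism can be sketched in a few lines of Python. The 1% introduction spending share and the 5x quantity growth to maturity are illustrative assumptions, not claims about any real economy:

```python
# Hypothetical illustration: one new good is introduced per year (a
# linear stream). Consumers initially spend a share s of current GDP on
# it, and by maturity its quantity has grown by a factor m, so the value
# baked into real GDP is intro_price * mature_quantity = s * m * GDP.

s = 0.01   # assumed introduction spending as a share of GDP
m = 5      # assumed quantity growth from introduction to maturity

gdp = [100.0]                       # arbitrary starting real GDP
for year in range(50):
    baked_in = s * gdp[-1] * m      # new good's mature contribution
    gdp.append(gdp[-1] + baked_in)  # everything else grows at 0%

# A linear stream of goods, yet GDP compounds at s*m = 5% per year.
print(gdp[-1] / gdp[0])             # ~= 1.05 ** 50, i.e. exponential
```

Each new good’s baked-in value is proportional to GDP at its introduction, so a constant flow of one good per year yields a constant proportional growth rate, which is exactly an exponential path.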
Standard economics interprets this as an exponential increase in product value over time. This is reasonable, and I think this assumption is well understood by economists going through this calculation.
The main reason I think this is an interesting observation is that there is some laundering that happens between the calculation and the practical conception of GDP. People (including economists) often conflate exponential growth in GDP with exponential growth in ability to have impact on the world. For instance “if the economy keeps doubling every 30 years, we need to have tiled the light cone or hard plateaued within a few hundred years”. But the calculation supports this continued exponential doubling with only a linear stream of new goods, and these goods may not be of the type that let you tile the light cone.
Whether this mechanism explains a large or small fraction of measured GDP growth over the last ~100 years is an empirical question. I’m not claiming that the exponential growth comes from linear introduction of new goods, just that the accounting + a plausible trajectory for new goods permits this.
Also: I am aware that, in practice, the way categories are counted means that a new good would probably fit into an existing category (e.g. the 71 categories the US uses) rather than creating a whole new one. I think the argument still works, because the same logic applies within categories.