The other model is a simple Moore’s law sort of model: H = A·exp(kt). Lots of technological things behave kinda like this (but, note, lots of them actually turn out to be sigmoidal: they grow exponentially for a while, then slow down and eventually plateau) and it’s a nice simple default model.
Is it lots, or is it all? To me it seems like the classic equation dH/dt = αH(1−H), or even a generalization of it in which H appears with some higher exponent, makes a lot of physical sense IRL to embody the notion that at some point the untapped potential of that technology sort of runs out—and that’s where you get all the sigmoids from. I can’t think of a single technology that can truly be expected to grow exponentially forever, let alone diverge to infinity. The question is usually just how high the ceiling is.
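That intuition is easy to check numerically: integrating dH/dt = αH(1−H) with a forward Euler step gives near-exponential growth at first, then saturation at the ceiling. A minimal sketch (the values of α, the step size, and the initial condition are illustrative, not from any real dataset):

```python
# Euler integration of the logistic ODE dH/dt = alpha * H * (1 - H),
# with the ceiling normalized to H = 1. All parameters are illustrative.
def logistic_trajectory(alpha=1.0, h0=0.01, dt=0.01, steps=2000):
    h = h0
    out = [h]
    for _ in range(steps):
        h += dt * alpha * h * (1 - h)
        out.append(h)
    return out

traj = logistic_trajectory()
# Early on growth is roughly exponential; by the end the trajectory
# has flattened out just under the ceiling.
print(traj[0], traj[-1])
```

Raising the exponent on H (the generalized form mentioned above) changes the shape of the S-curve but not the qualitative picture: growth still stalls as the untapped potential runs out.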
For sure nothing in the real world can really grow exponentially forever. I don’t know how consistently the failure to grow exponentially forever looks like a sigmoid rather than, say, an exponential that abruptly runs into a wall, or an exponential that gets “abandoned” before it turns sigmoid because some other thing comes along to take its place.
I’ll tweak the wording in the OP to be clearer about this. [EDITED to add:] Now done.
Well, this applies generally to all these models—why should it look like an exponential or a power law at all to begin with? These are simplifications that are born out of the fact that we can write out these very simple ODE models that reasonably approximate the dynamics and produce meaningful trajectories.
However I think “sigmoid” is definitely the most likely pattern, if we broaden that term to mean not strictly just the logistic function (which is the solution of y′ = y(1−y)) but also any other kind of similar function that has an S shape, possibly not even symmetric. “Running into a wall” is much more unphysical—it implies a discontinuity in the derivative that real processes never exhibit.
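One concrete example of a non-symmetric S-curve in that broader sense is the Gompertz curve, y(t) = K·exp(−b·exp(−ct)): its inflection point sits at y = K/e rather than at the midpoint K/2, so the approach to the ceiling is slower than the takeoff. A quick sketch with illustrative parameters:

```python
import math

# Gompertz growth: an S-curve that is NOT symmetric about its inflection
# point. Parameter values here are illustrative.
K, b, c = 1.0, 5.0, 1.0
y = lambda t: K * math.exp(-b * math.exp(-c * t))

t_inflection = math.log(b) / c  # where y'' = 0
print(y(t_inflection))          # equals K/e ~ 0.368, below the midpoint K/2
```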
Also you could see it as this: all these are special cases of a more general y′=P(y), where P is any polynomial. And that means virtually any analytical function, since those can be Taylor-expanded into polynomials reaching arbitrary accuracy in the neighbourhood of a specific point. So really the only assumptions baked in there are:
the rate of growth is analytical (no weird discontinuities or jumps; reasonable)
the rate of growth does not feature an explicit time dependence (also sensible, as these phenomena should happen equally regardless of which year they were kickstarted in)
Within this framework, the exponential growth is the result of a first order expansion, and the logistic is a second order expansion (under certain conditions for the coefficients). Higher orders, if present, could give rise to more complex models, but generally speaking as far as I can tell they’ll all tend to either converge to a given value (a root of the polynomial) or diverge to infinity. It would be interesting to consider the conditions under which convergence occurs I guess; it should depend on the spectrum of the polynomial but it might have a more physical interpretation.
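The convergence question has a standard answer from one-dimensional dynamical systems: for y′ = P(y), the roots of P are the equilibria, and a root r attracts nearby trajectories exactly when P′(r) < 0. A sketch using an arbitrary illustrative cubic (not any model from the thread):

```python
# For y' = P(y), roots of P are equilibria; a root r is stable when
# P'(r) < 0. The cubic below is an illustrative choice.
import numpy as np

coeffs = [-1, 3, -2, 0]      # P(y) = -y^3 + 3y^2 - 2y = -y(y-1)(y-2)
roots = np.roots(coeffs)     # equilibria: y = 0, 1, 2
dP = np.polyder(coeffs)      # P'(y)

for r in sorted(roots.real):
    slope = np.polyval(dP, r)
    kind = "stable" if slope < 0 else "unstable"
    print(f"equilibrium y = {r:.1f}: P'(y) = {slope:+.1f} ({kind})")
```

Here y = 0 and y = 2 are stable while y = 1 is unstable, so trajectories either settle onto a root or, for polynomials whose leading behaviour points outward, escape to infinity, matching the converge-or-diverge dichotomy described above.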