algorithmic progress is currently outpacing compute growth by quite a bit
This is not right, at least in computer vision: the two rates seem to be the same order of magnitude.
Physical compute has grown at 0.6 OOM/year, while physical compute requirements have decreased at 0.1 to 1.0 OOM/year; see a summary here or an in-depth investigation here.
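As a back-of-the-envelope check (a sketch using only the rough figures above; the 5-year horizon is arbitrary), the two contributions to effective compute come out comparable:

```python
# Rough comparison of the two growth rates quoted above, in OOM/year.
hardware = 0.6                  # physical compute growth
algo_low, algo_high = 0.1, 1.0  # decrease in physical compute requirements

years = 5
print(f"Hardware alone over {years} years: {years * hardware:.1f} OOMs")
print(f"Hardware + algorithms: {years * (hardware + algo_low):.1f} "
      f"to {years * (hardware + algo_high):.1f} OOMs of effective compute")
```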
Another relevant quote:
Algorithmic progress explains roughly 45% of performance improvements in image classification, and most of this occurs through improving compute-efficiency.
Davidson’s takeoff model illustrates this point: for some parameter settings, a “software singularity” occurs because software progress is not constrained by capital inputs to the same degree that hardware progress is.
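To make the dynamic concrete, here is a minimal toy sketch (my own caricature, not Davidson’s actual model, which tracks capital, labor, and hardware separately; all parameter values are illustrative). Software efficiency S multiplies research input, and research in turn improves S, so d(log S)/dt = k·S^(r−1); when the returns-to-software-R&D parameter r exceeds 1, the growth rate of S itself accelerates.

```python
import math

def simulate(r, steps=100, dt=0.1, k=0.3):
    """Euler-integrate d(log S)/dt = k * S**(r - 1).

    Research input is proportional to S itself (software improves the
    researchers), so r > 1 means each step of progress buys more than
    enough research to repeat it: superexponential growth, and a
    finite-time blowup in the continuous limit. Parameters illustrative.
    """
    S = 1.0
    for _ in range(steps):
        S *= math.exp(k * S ** (r - 1) * dt)
    return S

for r in (0.7, 1.0, 1.3):
    print(f"r = {r}: software level after 10 model-years = {simulate(r):,.1f}")
```

With r = 1.0 growth is exactly exponential (about e^3 ≈ 20x on this horizon); r = 0.7 decelerates, while r = 1.3 ends up roughly two orders of magnitude ahead over the same period.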
I would point out, however, that our current understanding of how software progress happens is somewhat poor. Experimentation is definitely a big component of software progress, and it is often understated on LW.
Cool, makes sense. Sounds like I remembered the upper bound for the algorithmic efficiency estimate. Thanks for correcting!
Algorithmic improvement has more FOOM potential. Hardware always has a lag.
That is, to a very basic approximation, correct.
More research on this soon!