I agree with you that Amdahl’s law applies; this claim isn’t meant to be read as “it’s possible to make programs faster indefinitely by adding more processors”. Two points, with brief sketches after them:
The final value of r that I estimate for AI is only one to two orders of magnitude above the value that Roodman estimates for human civilization. This is much faster, of course, but the fact that the gap is this small suggests there are indeed obstacles to making models run arbitrarily fast, and that the resulting timescales aren’t too far off from the ones at which we ourselves grow (sketched below).
I think it’s worth pointing out that we can make architectures more parallelizable, and we have done so in the past. RNNs were abandoned both because of the quadratic scaling with hidden state dimension and because backpropagation through them is an inherently serial computation across timesteps. It seems that when we run into Amdahl’s-law-style constraints, we can get around them to a substantial extent by replacing our architectures with more parallelizable ones, as in the move from RNNs to Transformers (see the toy example below).
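To spell out the bound I have in mind (this is just the textbook statement of Amdahl’s law, nothing specific to this discussion): if a fraction $p$ of a workload can be parallelized and the remaining $1-p$ is serial, then the speedup from $N$ processors is

$$S(N) = \frac{1}{(1-p) + p/N} \;\xrightarrow{N \to \infty}\; \frac{1}{1-p},$$

so even unlimited parallelism caps the speedup at $1/(1-p)$: with $p = 0.99$, for instance, you can never do better than $100\times$.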
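As a rough illustration of why the magnitude of r matters for timescales (a sketch under an assumption: I’m reading r here as the rate coefficient in a hyperbolic growth law of the kind Roodman fits, which may not match the exact definition in the original estimate), take

$$\dot{Y} = r\,Y^{1+B}, \qquad B > 0,$$

whose solution blows up in finite time at $t^{*} = 1/(r B Y_0^{B})$. The blow-up time scales as $1/r$, so an r that is one to two orders of magnitude larger compresses the characteristic timescale by a factor of roughly 10 to 100, rather than collapsing it to nothing.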
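To make the parallelizability contrast concrete, here is a minimal sketch (my illustration, not code from this discussion; the shapes and the shared toy projection are arbitrary choices) of why an RNN forward pass is a serial chain over timesteps while self-attention handles all positions in one batched matrix product:

```python
import numpy as np

T, d = 128, 64                             # sequence length, hidden dimension
x = np.random.randn(T, d)                  # input sequence
W_h = np.random.randn(d, d) / np.sqrt(d)   # recurrent weights: a d x d matrix,
                                           # hence the quadratic scaling in d
W_x = np.random.randn(d, d) / np.sqrt(d)

# RNN: each step depends on the previous hidden state, so the T steps form
# a serial chain -- this loop cannot be parallelized across t.
h = np.zeros(d)
for t in range(T):
    h = np.tanh(W_h @ h + W_x @ x[t])

# Self-attention: every position attends to every other in one batched
# matrix product, so all T positions are computed at once
# (causal masking omitted for brevity).
Q, K, V = x @ W_x, x @ W_x, x @ W_x        # toy shared projection
scores = Q @ K.T / np.sqrt(d)              # (T, T), all pairs in parallel
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ V                          # (T, d), no serial dependency
```

The backward pass inherits the same structure: gradients through the RNN must traverse the chain step by step (backpropagation through time), while the attention computation differentiates as a handful of large, parallel matrix multiplications.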