Yeah I think that’s another reasonable way to update on timelines. Here you are anchoring biological scaling laws on artificial scaling laws, rather than anchoring artificial parameters on biological parameters and leaving the scaling laws as a free variable (as done by the existing model).
One major counterargument would be “biological learning algorithms are better than artificial ones and can learn faster and so have better scaling laws”.
Separately, you can get some a priori support for "the human brain is undertrained relative to our compute-optimal scaling laws" if you think that, for evolution, scaling up data by 2x is a higher cost than scaling up brain size by 2x. (For neural networks these are equally costly if you look only at training compute, since training compute scales as the product of parameter count and data.) This seems pretty plausible—having twice as long a childhood makes it much more likely that you die before you ever reproduce, while having twice the brain size imposes higher metabolic costs, and plausibly the former is a lot more costly on the margin.
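To make the compute-symmetry point concrete, here is a minimal sketch assuming the standard approximation that training compute is C ≈ 6·N·D (N = parameters, D = training tokens), and the Chinchilla-style rule of thumb that compute-optimal training uses roughly 20 tokens per parameter. The brain-analogue numbers at the end (synapse count as parameters, a hypothetical lifetime "token" budget) are purely illustrative assumptions, not claims about actual biological quantities:

```python
# Training compute under the common approximation C ≈ 6 * N * D
# (N = parameter count, D = training tokens).
def train_compute(n_params, n_tokens):
    return 6 * n_params * n_tokens

N, D = 1e9, 2e10  # illustrative baseline: 1B params, 20B tokens
base = train_compute(N, D)

# Doubling parameters and doubling data each double training compute,
# so they are "equally costly" from a pure training-compute view.
assert train_compute(2 * N, D) == 2 * base
assert train_compute(N, 2 * D) == 2 * base

# Chinchilla-style rule of thumb: compute-optimal training uses
# roughly 20 tokens per parameter; far fewer means "undertrained".
def undertrained(n_params, n_tokens, tokens_per_param=20):
    return n_tokens / n_params < tokens_per_param

# Hypothetical brain-as-network numbers, for the shape of the
# argument only: ~1e14 synapses as "parameters", and an assumed
# lifetime data budget well below 20x that.
print(undertrained(n_params=1e14, n_tokens=1e15))  # True under these assumptions
```

If evolution pays more per doubling of data than per doubling of "parameters", the evolutionary optimum would sit below the 20-tokens-per-parameter line, i.e. on the undertrained side.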