I see some value in framing “general intelligence” as a binary property, but it doesn’t feel as though that fully captures the phenomenon: it would seem rather strange to describe GPT-4 as a 0 on the general intelligence scale.
I think a better analogy might be the sum of a geometric series. The infinite sum is a/(1 − r) for |r| < 1; consider its value for a few values of r as r increases at a steady rate:
r = 0.5 → 2a
r = 0.6 → 2.5a
r = 0.7 → ≈3.3a
r = 0.8 → 5a
r = 0.9 → 10a
r = 1 → diverges to infinity
What we see, then, is quite significant returns to steady increases in r, followed by a sudden divergence.
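To make this concrete, here's a minimal Python sketch (my own illustration, not from any particular source) that evaluates the closed-form sum a/(1 − r) with a = 1, showing how the returns to increases in r accelerate as r approaches 1:

```python
# Closed-form sum of an infinite geometric series: S = a / (1 - r), valid for |r| < 1.
def geometric_sum(a: float, r: float) -> float:
    if abs(r) >= 1:
        raise ValueError("series diverges for |r| >= 1")
    return a / (1 - r)

# Steady increases in r produce accelerating returns, then divergence at r = 1.
for r in [0.5, 0.6, 0.7, 0.8, 0.9, 0.99]:
    print(f"r = {r}: sum = {geometric_sum(1.0, r):.2f} * a")
```

With a = 1 this reproduces the table above (2, 2.5, ≈3.33, 5, 10, …), and pushing r from 0.9 to 0.99 multiplies the sum tenfold again, which is the "sudden divergence" intuition.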
(Aside: this model feels related to nuclear chain reactions, in that the total number of reactions can be modelled as a geometric series. However, that model has not just sub-criticality and super-criticality but also exact criticality, and I’m not sure how criticality would fit in here.)
In contrast, many economists want to model AI as a more traditional exponentially increasing system (i.e. growing like r^n).