So in evaluating that, the key question is whether LLMs were already on the critical path to AGI.
Is it more like...
We’re going to get AGI at some point and we might or might not have gotten LLMs before that.
or
It was basically inevitable that we get LLMs before AGI. LLMs “always” come X years ahead of AGI.
or
It was basically inevitable that we get LLMs before AGI, but there’s a big range of when they can arrive relative to AGI.
And OpenAI made the gap between LLMs and AGI bigger than the counterfactual.
or
And OpenAI made the gap between LLMs and AGI smaller than the counterfactual.
My guess is that the true answer is closest to the second option: LLMs happen a predictable-ish period ahead of AGI, in large part because they’re impressive enough and generally practical enough to drive AGI development.