I buy this for the post-GPT-3.5 era. What’s confusing me is that the rate of advancement in the pre-GPT-3.5 era was apparently the same as in the post-GPT-3.5 era, i.e., doubling every 7 months.
Why would we expect there to be no distribution shift once the AI race kicked into high gear? GPT-2 to GPT-3 to GPT-3.5 proceeded at a snail’s pace by modern standards. How did the world happen to invest in them just enough for them to fit into the same trend?
Actually, progress in 2024 is roughly 2x faster than earlier progress, which seems consistent with thinking there is some distribution shift. It’s just that this distribution shift didn’t kick in until we had Anthropic competing with OpenAI and reasoning models. (Note that OpenAI didn’t release a notably better model than GPT-4-1106 until o1-preview!)