I am fairly confident that GPT-5.1, which I believe is a checkpoint of GPT-4o, has more than 60% of its training FLOPs in post-training.
If OpenAI created another GPT-4-scale pretrain, they'd post-train it all over again.
Of course they'll do it, just not that often. Likely once a year or something like that.