In AI, I think we’ve seen perhaps 2 massively trend-breaking breakthroughs in the last 20 or so years: deep learning at substantial scale (starting with AlexNet) and (maybe?) scaling up generative pretraining (starting with GPT-1).[1] Scaling up RL and reasoning models probably caused somewhat above-trend progress (in 2025), but I don’t think this constitutes a massive trend break.
Somewhat implied but worth noting: both of these trend breaks were not principally algorithmic but hardware-related.
AlexNet: Hey, shifting compute to GPUs let us do neural networks way better than CPUs.
Scaling up: Hey, money lets us link together several thousand GPUs for longer periods of time.
This is maybe some evidence that future trend-breaking events might also be hardware-related, which runs contrary to several projections.