I think that is just true. In hindsight, my mistake is that I hadn't updated sufficiently toward how the major players are shifting to their own chip-design capacity. (Apple comes to mind, but I was definitely caught off guard by how far even Meta and Amazon had moved forward.) I had the impression that Amazon had a bad time with their previous generation of chips, and that the new generation of their chips is focused on inference anyway.
But now, with the blending of the inference and training regimes, maybe the "intermediaries" like Nvidia get to capture less and less of the upside. And it seems more and more likely to me that we are having a "back to basics" moment of looking at the base ingredients: the compute and the electricity.
It seems like the big players already have plans to cut Nvidia out of the loop, though:
OpenAI
Amazon/Anthropic (Trainium)
Meta (MTIA)
Google (TPU)
And while Nvidia still seems to have the best general-purpose hardware, they're limited by competition with AMD, Apple, and Qualcomm.