My estimates on text traffic and model weight sizes:
Text traffic
OpenAI was doing ~8.64 trillion tokens per day in October 2025 on the API (not including ChatGPT)
Google went from processing ~500 trillion tokens per month in May to ~1 quadrillion tokens per month in July.
It’s uncertain how many of these are output tokens. Based on OpenRouter’s numbers, a plausible estimate is 10%, or 100 trillion output tokens per month (back in July). At ~2 bytes per token, that’s 200 TB per month, or ~7 TB per day. I’m not sure how many data centers they have—if there are 10, that works out to ~700 GB per datacenter per day.
It has been 8 months since then, so a naive extrapolation (one doubling every 2 months, matching the May-to-July jump) suggests traffic is now ~16x larger.
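The traffic BOTEC above can be sketched in a few lines. The bytes-per-token figure, the 10% output share, the 10-datacenter count, and the 30-day month are all assumptions carried over from the estimates in the text, not measured values:

```python
# Back-of-envelope: output-token traffic per datacenter per day.
# Assumed inputs (from the text's estimates, not measured):
TOKENS_PER_MONTH = 1e15   # ~1 quadrillion total tokens/month (July figure)
OUTPUT_FRACTION = 0.10    # plausible output-token share, per OpenRouter
BYTES_PER_TOKEN = 2       # rough average for English text
DATACENTERS = 10          # guess at datacenter count
DAYS_PER_MONTH = 30

output_bytes_per_month = TOKENS_PER_MONTH * OUTPUT_FRACTION * BYTES_PER_TOKEN
tb_per_day = output_bytes_per_month / DAYS_PER_MONTH / 1e12
tb_per_day_per_dc = tb_per_day / DATACENTERS

print(f"{tb_per_day:.1f} TB/day total output")          # ~6.7 TB/day
print(f"{tb_per_day_per_dc * 1e3:.0f} GB/day per DC")   # ~670 GB/day

# Naive extrapolation: one doubling per 2 months, for 8 months.
doublings = 8 / 2
print(f"~{2 ** doublings:.0f}x larger now")             # ~16x
```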
For model weight sizes, Grok-3 and Grok-4 were 3 trillion parameters, per Elon. That would be ~1.5–3 TB, depending on whether they use 4- or 8-bit datatypes. Grok-5 will be 6 trillion parameters, also per Elon.
Anthropic’s Feb 2026 risk report discussed securing “multi-terabyte model weights”.
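The parameter-count-to-size arithmetic is just params × bits ÷ 8; a small helper makes the cited Grok figures easy to check (the parameter counts are the ones quoted above, the datatype widths are the same 4/8-bit assumption):

```python
def weights_tb(params: float, bits_per_param: int) -> float:
    """Storage size in TB for `params` parameters at `bits_per_param` bits each."""
    return params * bits_per_param / 8 / 1e12

print(weights_tb(3e12, 4))  # Grok-3/4 at 4-bit: 1.5 TB
print(weights_tb(3e12, 8))  # Grok-3/4 at 8-bit: 3.0 TB
print(weights_tb(6e12, 8))  # Grok-5 at 8-bit: 6.0 TB
```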
So a reasonable estimate is 1–10 TB of output tokens per datacenter per day, and 1–10 TB for the model weights.
Very helpful data points and BOTEC. I largely agree with your estimates. Thank you.