Depending on OpenAI's growth, this is more of a soft upper bound on what gets built.
I'm confused: the announcement indicates that the $400B has been committed and is not contingent on OpenAI's growth (though perhaps you're implying there's no way they actually spend the full $400B unless OpenAI's revenue keeps growing rapidly)?
Also, why would this $400B / 7 GW be an upper bound? A recent WSJ article suggests they're planning to surpass it, although details are sparse.
As a naive follow-up: let's say GPT-6 could be trained in 3 months on a 3 GW cluster. Could I instead train it in 9 months on a 1 GW cluster?
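(The arithmetic implicit in the question is that total training compute ≈ power × time, so 1 GW for 9 months matches 3 GW for 3 months in GW-months. A minimal back-of-envelope sketch of that equivalence, assuming perfect linear scaling and a fixed, purely hypothetical FLOP/s-per-watt efficiency, and ignoring real-world caveats like parallelism limits, utilization, and hardware improving over the longer run:)

```python
# Back-of-envelope: is 1 GW for 9 months the same total compute as 3 GW for 3 months?
# Assumes perfect linear scaling and a fixed, hypothetical cluster efficiency;
# real runs face parallelism/batch-size limits, utilization losses, and changing hardware.

SECONDS_PER_MONTH = 30 * 24 * 3600   # rough month length
FLOPS_PER_WATT = 2e10                # placeholder effective FLOP/s per watt (not a real figure)

def training_flops(power_gw: float, months: float) -> float:
    """Total FLOPs = power (W) * efficiency (FLOP/s per W) * time (s)."""
    watts = power_gw * 1e9
    seconds = months * SECONDS_PER_MONTH
    return watts * FLOPS_PER_WATT * seconds

big_fast = training_flops(power_gw=3, months=3)    # 3 GW cluster, 3 months
small_slow = training_flops(power_gw=1, months=9)  # 1 GW cluster, 9 months

print(f"3 GW x 3 mo: {big_fast:.2e} FLOPs")
print(f"1 GW x 9 mo: {small_slow:.2e} FLOPs")
print(f"Equal under these assumptions: {big_fast == small_slow}")
```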