Interesting read, however I am a bit surprised by how you treat power, with the US at 600GW and China at 5× more. Similar figures are often quoted in mainstream media, and I think they miss the point. Power seems relevant only in terms of supplying AI compute, and possibly robotics, and only then IF it is actually a constraint.
However, a basic calculation shows it should not be. For example, say in 2030 we get a large compute increase, with 50 million H100-equivalents produced per year, up from ~3 million equivalents in 2025. These would require ~1kW extra each, or ~50GW in total including infrastructure.
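To make that fleet-level arithmetic explicit, here is a minimal sketch; the 50 million H100-equivalents per year and ~1kW per accelerator (including infrastructure overhead) are the assumptions stated above, not measured figures.

```python
# Rough fleet-level power estimate (assumptions from the comment above).
gpus_per_year = 50e6     # assumed H100-equivalents produced per year in 2030
watts_per_gpu = 1_000    # ~1 kW each, including infrastructure overhead (assumption)

total_power_gw = gpus_per_year * watts_per_gpu / 1e9
print(f"Extra power needed: ~{total_power_gw:.0f} GW")  # ~50 GW
```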
Now this may seem like a lot, but consider the cost per GPU: if a chip requiring 1kW costs $20K, then the cost to power it with solar and batteries is far less. Let's say the solar farm and data center are in Texas, with a solar capacity factor of 20%. Powering the chip almost 24⁄7 from solar and batteries requires about 5kW of panels and say 18kWh of batteries. Average solar panel prices are <10c per watt, so just $500 for the panels. At scale, batteries are heading below $200 per kWh, so that's $3,600. This is a lot less than the cost of the chip. Solar panels and batteries are commodities, so even if China produces more of them than the US, it cannot stop anyone worldwide from using them.
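As a sanity check on the per-GPU economics, here is a small sketch using the illustrative figures quoted above (20% capacity factor, <10c/W panels, ~$200/kWh batteries, a $20K chip); none of these are authoritative prices, just the comment's assumptions.

```python
# Per-GPU cost of off-grid solar + storage vs. the chip itself
# (all inputs are the illustrative figures from the comment above).
gpu_power_kw = 1.0            # continuous draw per H100-equivalent, incl. overhead
capacity_factor = 0.20        # assumed Texas solar capacity factor
panel_cost_per_watt = 0.10    # <10 cents per watt
battery_hours = 18            # hours of storage to ride through the night
battery_cost_per_kwh = 200    # heading below $200/kWh at scale
chip_cost = 20_000            # assumed cost of a 1 kW accelerator

panel_kw = gpu_power_kw / capacity_factor              # ~5 kW of panels
panel_cost = panel_kw * 1_000 * panel_cost_per_watt    # ~$500
battery_kwh = gpu_power_kw * battery_hours             # ~18 kWh
battery_cost = battery_kwh * battery_cost_per_kwh      # ~$3,600

print(f"Panels:    {panel_kw:.0f} kW  -> ${panel_cost:,.0f}")
print(f"Batteries: {battery_kwh:.0f} kWh -> ${battery_cost:,.0f}")
print(f"Power total ${panel_cost + battery_cost:,.0f} vs chip ${chip_cost:,}")
```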
Power consumption is only relevant if it is the limiting factor in building data centers; the installed capacities of large countries don't really enter into it. Having a large existing capacity is a potential advantage, but only if the opposing country cannot build the power for its own data centers and is actually stopped by that.
I also strongly expect branch 1, where the new algorithm is suddenly a lot more power-efficient anyway.
Economically, I agree with your calculations on power. However, the US has worked itself into a corner where we frequently tie our own hands, refusing to let the people who want to build the energy production facilities do so where they’re needed on anything like a reasonable timescale.