OpenAI estimated that the energy consumption for training GPT-3 was about 3.14 x 10^17 Joules.
sanity checking this figure: 1 kWh is 1000 W x 60 x 60 s = 3.6 MJ, so GPT-3 would have consumed 3.14 x 10^17 J / 3.6 x 10^6 J/kWh ≈ 8.7 x 10^10 kWh. at a very conservative $0.04/kWh, that’s $3.5B just in the power bill — disregarding all the non-power costs (i.e. the overheads of operating a datacenter).
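the arithmetic can be spot-checked in a couple of lines (the 3.14 x 10^17 J figure and the $0.04/kWh rate are the numbers from this thread, not independent data):

```python
# Sanity check of the quoted 3.14e17 J training-energy figure.
joules = 3.14e17            # figure quoted in the post
j_per_kwh = 1000 * 60 * 60  # 1 kW drawn for 1 hour = 3.6e6 J
kwh = joules / j_per_kwh
cost = kwh * 0.04           # conservative $0.04/kWh assumed above

print(f"{kwh:.2e} kWh, ${cost / 1e9:.1f}B")  # prints "8.72e+10 kWh, $3.5B"
```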
i could believe this number’s within 3 orders of magnitude of truth, which is probably good enough for the point of this article, but i am a little surprised if you just took it 100% at face value.
i could believe this number’s within 3 orders of magnitude of truth, which is probably good enough for the point of this article
It’s not. As best I can tell it’s off by more like 4+ OOM. A very quick search suggests actual usage was maybe more like 1 GWh. Back-of-the-envelope guess: thousands of GPUs, thousands of hours, < 1 kW per GPU — a few GWh.
https://www.theregister.com/2020/11/04/gpt3_carbon_footprint_estimate/
https://www.numenta.com/blog/2022/05/24/ai-is-harming-our-planet/
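the back-of-envelope above can be made concrete; the specific counts below are illustrative assumptions, only the orders of magnitude matter:

```python
# Rough check of "thousands of GPUs, thousands of hours, < 1 kW/GPU".
# All three inputs are assumed round numbers, not reported values.
gpus = 3000        # "thousands of GPUs"
hours = 1000       # "thousands of hours"
kw_per_gpu = 0.4   # well under 1 kW per GPU

gwh = gpus * hours * kw_per_gpu / 1e6  # kWh -> GWh
print(f"≈ {gwh:.1f} GWh")  # prints "≈ 1.2 GWh"

# vs the figure implied by the post: 8.7e10 kWh = 87,000 GWh,
# i.e. roughly 4-5 orders of magnitude higher.
```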
Same. That doesn’t seem to rise to the quality standards I’d expect.
Yeah, I should have double-checked.
Editing post to reflect the correct values. Does not affect the “two decades” bottom line conclusion.