No: AI is at least as energy-efficient as your brain.

This is simply because AIs run on electricity, and solar panels convert sunlight to electricity far more efficiently than photosynthesis converts sunlight to food.

Our brains run on sunlight: plants capture it via photosynthesis, and we take it in as dietary energy. From sunlight to dietary energy this chain is about 0.25-0.5% efficient; let's say it's 0.35% efficient. (If we define one unit of capability as what one human produces from their dietary energy, this is also our complete sunlight-to-capability efficiency for a human.)

AI systems run on sunlight via solar panels and electrical consumption. From sunlight to GPU this is about 13.5-18% efficient; let's round down and say it's 13% efficient. (Data: typical solar panel efficiency = 15-20%; typical electricity distribution efficiency = ~90%.) 13% is then the AI's sunlight-to-electricity efficiency, but not yet its sunlight-to-capability efficiency.
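
As a quick sanity check, here's that component math in a few lines of Python; the panel and distribution figures are just the assumptions stated above, not measurements:

```python
# Sunlight-to-electricity efficiency for the AI pipeline.
# All inputs are this post's assumptions.
panel_low, panel_high = 0.15, 0.20   # typical solar panel efficiency
distribution = 0.90                  # typical electricity distribution efficiency

low = panel_low * distribution       # 0.135 -> 13.5%
high = panel_high * distribution     # 0.18  -> 18%
print(f"sunlight-to-electricity: {low:.1%} to {high:.1%}")
# I take roughly the low end, 13%, going forward.
```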

Let's calculate the final part. We need to assume an amount of compute for a human-equivalent AI system. I will assume that (at inference time) we can run a human-equivalent AI system on 10 NVIDIA RTX 4070 GPUs at full power, each consuming ~200 watts, so 2 kW in total. In contrast, the typical human consumes ~100 watts. 100 W divided by 2 kW gives us 5% as the AI's electricity-to-capability efficiency, which we then multiply by the earlier figure to get the total: 5% electricity-to-capability efficiency times 13% sunlight-to-electricity efficiency = 0.65% sunlight-to-capability efficiency.
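
To make the whole chain easy to pick apart, here's a minimal Python sketch of the full calculation. Every input is one of this post's assumptions (0.35% sunlight-to-diet, 13% sunlight-to-electricity, 10 GPUs at ~200 W each, a 100 W human), not a measured fact:

```python
# Full sunlight-to-capability comparison, using this post's assumptions.

# Human chain: sunlight -> photosynthesis -> dietary energy -> capability.
human_sunlight_to_diet = 0.0035      # 0.35%, chosen from the 0.25-0.5% range
human_diet_to_capability = 1.0       # by definition: one human's diet = one unit of capability
human_total = human_sunlight_to_diet * human_diet_to_capability

# AI chain: sunlight -> solar panel -> grid -> GPUs -> capability.
ai_sunlight_to_electricity = 0.13    # 13%, from panels (15-20%) times distribution (~90%)
human_power_w = 100                  # watts a typical human runs on
gpu_count, gpu_power_w = 10, 200     # assumed human-equivalent inference: 10 RTX 4070s
ai_electricity_to_capability = human_power_w / (gpu_count * gpu_power_w)  # 100 / 2000 = 5%
ai_total = ai_sunlight_to_electricity * ai_electricity_to_capability

print(f"human sunlight-to-capability: {human_total:.2%}")   # 0.35%
print(f"AI sunlight-to-capability:    {ai_total:.2%}")      # 0.65%
print(f"AI / human ratio: {ai_total / human_total:.2f}x")   # ~1.86x
```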

So humans are 0.35% efficient, and AIs are 0.65% efficient.

By these assumptions, AI is roughly twice as efficient (0.65% vs. 0.35%) in terms of real energy input (here, energy from the sun). This is the number the economy is going to care about: why use land for crops when you can use it for solar panels?

Please pick apart my numbers and assumptions in the comments. Thanks.