Hmm, I still don’t believe this. An AC is ultimately hooked up to a single 240V outlet and so simply can’t consume that much power (usually maxing out at 3000W, and almost always more like 1500W).
And ultimately the only thing that matters here is power consumption, which basically all gets converted into heat. I would be surprised if AC ends up more than 50% of power consumption, and even then 0.4°C would mean that electrical power consumption as a whole is increasing ambient temperature by a full degree, which doesn’t seem realistic to me.
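(A minimal sketch of the arithmetic implied in that comment. The 0.4°C figure and the ≤50% AC share are the numbers from the comment itself; nothing here is measured.)

```python
# If AC alone accounted for a 0.4 C bump, and AC is at most half of all
# electrical consumption (essentially all of which ends up as heat),
# then total electrical consumption would imply roughly double the bump.

ac_bump_c = 0.4          # the disputed AC contribution, from the comment
ac_share_of_power = 0.5  # assumed upper bound on AC's share of consumption

implied_total_bump_c = ac_bump_c / ac_share_of_power
print(f"implied warming from all electrical consumption: {implied_total_bump_c:.1f} C")
```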
No, an AC actually moves 2-3x as much heat as its input power, so a 1500W AC will extract an additional 3000W from inside and dump 4500W outside.
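(A sketch of the heat balance in that reply, taking the 1500W figure from above and assuming a coefficient of performance of 2, i.e. 2W of heat moved per 1W of electricity; real units are roughly in the 2-4 range.)

```python
# Heat rejected outdoors = electrical input + heat extracted indoors.
# Numbers follow the comment above: 1500 W input, COP of 2 assumed.

electrical_input_w = 1500.0  # power drawn from the outlet
cop = 2.0                    # heat moved per watt of electrical input (assumed)

heat_extracted_indoors_w = cop * electrical_input_w                     # 3000 W
heat_dumped_outdoors_w = electrical_input_w + heat_extracted_indoors_w  # 4500 W

print(f"extracted from inside: {heat_extracted_indoors_w:.0f} W")
print(f"dumped outside:        {heat_dumped_outdoors_w:.0f} W")
```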
The cooled indoor air also makes its way back outside before very long, though, so the extracted heat should mostly cancel out over the course of a day, leaving just the power consumption of the AC.
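(A sketch of that cancellation argument: if the heat pumped out of the building eventually leaks back in, then over a full cycle the only net heat added to the neighborhood is the electrical input. The duty cycle below is made up for illustration.)

```python
# Over a full day, if the heat extracted from indoors eventually returns
# (the building warms back up, cooled air escapes), the net heat added to
# the environment reduces to the electrical input alone.

electrical_input_w = 1500.0
cop = 2.0
hours_running = 8.0  # assumed duty cycle, illustrative only

heat_dumped_outdoors_j = (1 + cop) * electrical_input_w * hours_running * 3600
heat_extracted_indoors_j = cop * electrical_input_w * hours_running * 3600

# The extracted heat cancels once it leaks back; what remains is the work:
net_heat_added_j = heat_dumped_outdoors_j - heat_extracted_indoors_j
print(f"net heat added over the day: {net_heat_added_j / 3.6e6:.0f} kWh "
      f"(= electrical input of {electrical_input_w * hours_running / 1000:.0f} kWh)")
```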
And ultimately the only thing that matters here is power consumption,
Why? I think this is measuring exterior temperature, not the average of exterior and interior temperature. If cooling is set to a comfortable temperature and only run on heat wave days, then you should expect the heat wave days to also get a boost from the thermal mass of the interior, and there could be other indirect effects.
[Like, I would buy that power consumption dominates. But the only thing? Seems premature.]
I would be surprised if AC ends up more than 50% of power consumption
It does in Texas during heat waves (focusing only on peak demand, which seems fair). Texas is, of course, hotter than Europe (and places even hotter than Texas have even higher cooling costs).
This is what I was thinking. In a city in the summer there might be almost as much indoor space as outdoor space at ground level. The temporary change in outside temperature would then be almost as much as the reduction indoors, right?
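(A toy version of that point: shuffling heat between an indoor air volume and a comparably sized ground-level outdoor air volume produces temperature changes of comparable magnitude, at least until mixing with the wider atmosphere dilutes the outdoor side. All volumes and the amount of heat moved are invented for illustration.)

```python
# Moving heat Q between two well-mixed air volumes changes their
# temperatures in inverse proportion to their heat capacities.

rho_air = 1.2    # kg/m^3, density of air
cp_air = 1005.0  # J/(kg*K), specific heat of air

indoor_volume_m3 = 1.0e6    # assumed indoor air volume for a city block
outdoor_volume_m3 = 1.5e6   # assumed nearby ground-level outdoor air volume
q_moved_j = 1.0e9           # heat pumped from inside to outside (illustrative)

dT_indoor_k = -q_moved_j / (rho_air * cp_air * indoor_volume_m3)
dT_outdoor_k = q_moved_j / (rho_air * cp_air * outdoor_volume_m3)

print(f"indoor change:  {dT_indoor_k:+.2f} K")
print(f"outdoor change: {dT_outdoor_k:+.2f} K")
```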
I don’t really have a good sense of (nor am I doing the math on) indoor versus outdoor space, or how rapidly air moves through cities. I still suspect this concern is largely illusory and another justification for the cult of pain. But I do want to think about the physics correctly.