Huge computing facility! It’s easier to dissipate heat in space.
Not an expert, but my understanding (from reading Heinlein, and I think other sources) is that it’s hard to dissipate heat in space, because there’s nothing to conduct it away.
But isn’t space like a heat bath at about −270° C (the 2.7 K cosmic microwave background)? I think there’s a finer point one or both of us is missing.
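The background temperature turns out not to matter. Net radiated flux goes as the difference of fourth powers, and at 2.7 K the environment term is vanishingly small — a quick sketch (Stefan–Boltzmann, perfect black body assumed):

```python
# How much does the ~2.7 K cosmic background "heat bath" actually help?
# Net radiated flux: q = sigma * (T_hot^4 - T_env^4)
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)

T_surface = 373.15  # radiator at the boiling point of water, K
T_cmb = 2.7         # cosmic microwave background, K

q_gross = SIGMA * T_surface**4                 # radiating into nothing at all
q_net = SIGMA * (T_surface**4 - T_cmb**4)      # radiating into the 2.7 K sky

print(f"gross flux: {q_gross:.1f} W/m^2")
print(f"net flux:   {q_net:.1f} W/m^2")
```

The two numbers agree to within microwatts per square metre: being surrounded by a 2.7 K bath is no better than radiating into nothing. The cold of space doesn’t pull heat out of you; you have to push it out as photons.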
Cooling is much easier on the ground.
In space you can only dissipate heat by radiation. In an atmosphere you can also transfer excess heat into matter that you can carry away and dump elsewhere, using conduction, convection, and forced circulation.
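To put rough numbers on that, here is a sketch comparing radiation against convective cooling for a 100 °C surface in 25 °C surroundings. The convection coefficients are typical textbook ranges, not measurements — purely illustrative assumptions:

```python
# Rough flux comparison for a 100 C surface in 25 C surroundings.
# The h values below are assumed order-of-magnitude textbook figures.
SIGMA = 5.670e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)
T_s, T_amb = 373.15, 298.15  # surface and ambient temperature, K

q_rad = SIGMA * (T_s**4 - T_amb**4)   # radiation: the only option in space
q_forced = 100.0 * (T_s - T_amb)      # forced air, h ~ 100 W/(m^2 K), assumed
q_water = 3000.0 * (T_s - T_amb)      # pumped water, h ~ 3000 W/(m^2 K), assumed

for name, q in [("radiation", q_rad), ("forced air", q_forced), ("water loop", q_water)]:
    print(f"{name:>10}: {q:9.0f} W/m^2")
```

Under these assumptions, forced air moves roughly an order of magnitude more heat per square metre than radiation, and a pumped liquid loop a couple of orders more — which is why ground-based data centers use air and water, not radiators.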
For a concrete example, consider Google’s average 2011 power consumption of 258 MW. What happens if they do all that in a huge server farm in space? Assume the exterior is a perfect black body and the interior is a perfect thermal conductor.
From the Stefan-Boltzmann law, for the equilibrium surface temperature to be at the boiling point of water, the surface area must be about 235,000 m² — the area of a sphere roughly 273 m in diameter. Alternatively, if it was a large flat shape, edge-on to the Sun and radiating from both faces, it could be a square roughly 343 m on a side.
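Recomputing the sizing above (black body, emissivity 1, solar input ignored):

```python
import math

# Area needed to radiate 258 MW at the boiling point of water,
# assuming a perfect black body and neglecting absorbed sunlight.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
P = 258e6         # W, Google's average 2011 power draw
T = 373.15        # K, boiling point of water

area = P / (SIGMA * T**4)             # total radiating area, m^2
d_sphere = math.sqrt(area / math.pi)  # sphere surface: A = pi * d^2
side = math.sqrt(area / 2)            # flat plate radiating from both faces

print(f"radiating area:   {area:.0f} m^2")        # ~235,000 m^2
print(f"sphere diameter:  {d_sphere:.0f} m")      # ~273 m
print(f"two-sided square: {side:.0f} m per side") # ~343 m
```

At 373 K a black body radiates only about 1.1 kW/m², which is why the structure has to be so large.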
Increasing the surface area with lots of large parallel fins, like on a heatsink, only works when immersed in a thermally conductive circulating medium. In space, the fins just block each other’s view, and the effective radiative area is that of the convex hull.
Bummer!
But we could change that slightly (?): what about a process that produces an enormous amount of radiation? In space you can just dump those pesky photons in the backyard, while on Earth there’s always someone who owns this or that piece of land.
High-intensity computing generates waste heat, though. You can’t turn waste heat into directed energy within the laws of thermodynamics.
The point we are missing is the means of dissipation. In space it’s impossible to dissipate through convection, while it’s still possible to dissipate through radiation.
A space station would roast star-side and freeze in the opposite direction.