In particular, I wonder how you would shed all that heat in a vacuum. I keep seeing arguments that the cold of space would make data centers easier to cool, but that contradicts everything I’ve read about actual spacecraft: they have dedicated systems to shed the excess heat their electronics produce, not dedicated heating systems for life support with the cold of space taking care of waste heat for free. Apollo, as I understand it, mostly relied on the waste heat of systems dedicated to other functions; the Apollo 13 crew only had to worry about cold because they shut down virtually all sources of waste heat at once to conserve electricity. I believe Apollo’s only dedicated systems for cabin temperature control were cooling systems. And the Apollo 13 situation certainly wouldn’t apply to a data center, which generates waste heat continuously by design.
Yes, I’ve worked on spacecraft, and you have to put about as much effort into dissipating your energy as into collecting it. The benefits I’ve heard claimed for space-based compute are minor at best, and the economic downsides of cooling are show-stopping before you even get to the (many) other negatives. Electronics also have to run much hotter than they otherwise would when your only cooling method is radiation. I’m honestly struggling to understand why some people I normally consider reasonable and realistic are giving it serious consideration.
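To put rough numbers on that (a back-of-the-envelope sketch; the cell efficiency, emissivity, radiator temperatures, and sink temperature are my assumptions, not figures from any real vehicle):

    # For 1 MW of compute, essentially every watt consumed becomes waste heat.
    # Compare the solar array area that collects the power with the radiator
    # area that rejects it. All constants below are illustrative assumptions.
    SOLAR_FLUX = 1361.0  # W/m^2, solar constant near Earth
    CELL_EFF = 0.20      # assumed cell efficiency
    SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
    EPS = 0.9            # assumed radiator emissivity
    T_ENV = 200.0        # assumed effective background sink temperature, K

    P = 1e6  # W of electrical load, all of it ending up as heat

    collect = P / (SOLAR_FLUX * CELL_EFF)  # array area to collect P

    def reject_area(t_rad_k):
        # Two-sided flat panel radiating to the assumed 200 K background.
        return P / (2 * EPS * SIGMA * (t_rad_k**4 - T_ENV**4))

    print(f"collect: {collect:.0f} m^2")                  # ~3700 m^2
    print(f"reject at 20 C: {reject_area(293):.0f} m^2")  # ~1700 m^2
    print(f"reject at 80 C: {reject_area(353):.0f} m^2")  # ~700 m^2

Same order of magnitude either way, and the T^4 term is why running the electronics hotter buys back so much radiator area.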
It’s not that complex in principle: you use really big radiators.
If you look at the front-page video at https://www.starcloud.com/, you see exactly that: what might look like just a big solar array is actually also a big radiator.
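One wrinkle with dual-use panels (my own observation, not from the video): the sun-facing side is busy absorbing, so only the back face radiates effectively, roughly halving rejection per square meter versus a dedicated two-sided radiator. A quick sketch with assumed constants:

    # One-sided vs two-sided rejection at an assumed 60 C radiator temperature.
    SIGMA, EPS, T_ENV = 5.670e-8, 0.9, 200.0  # assumed, as before
    T_RAD = 333.0  # K, assumed radiator temperature

    flux_one_side = EPS * SIGMA * (T_RAD**4 - T_ENV**4)
    print(f"one-sided: {flux_one_side:.0f} W/m^2, "
          f"two-sided: {2 * flux_one_side:.0f} W/m^2")  # ~546 vs ~1092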
AFAICT it’s one of those things that works in principle but not in practice. In theory you can make really cheap space solar and radiator arrays, and with full reuse, launch costs can approach the cost of propellant. In practice we’re not even close to either, and any short-term bet on it is just going to fail.
Skimming Starcloud’s whitepaper: they say the radiators are “held at 20 C” without mentioning how they’d actually do that. For effective heat distribution you need a working fluid and small tubes, which are very vulnerable to micrometeorite impacts, which in turn means a lot more shielding mass. If you don’t have effective heat distribution, your chips get too hot.
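For a sense of scale on what “effective heat distribution” demands (my own numbers; the coolant choice, heat load, and allowed temperature rise are assumptions):

    # Coolant mass flow needed to ferry heat from the chips out to the
    # radiator surface: m_dot = Q / (c_p * dT). Illustrative assumptions only.
    Q = 1e6            # W of waste heat to move (assumed 1 MW)
    CP_WATER = 4186.0  # J/(kg K), specific heat of water
    DT = 10.0          # K, assumed allowed coolant temperature rise

    m_dot = Q / (CP_WATER * DT)
    print(f"{m_dot:.1f} kg/s")  # ~23.9 kg/s circulating continuously

All of that flow runs through pumped loops and fine tubing spread across the radiator face, which is exactly the plumbing that micrometeorites threaten.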
We have lots of examples of radiators in space (because radiating is approximately the only thing that works), and AFAIK micrometeorite impacts haven’t been a dealbreaker as long as you slightly overprovision capacity and build in structural redundancy. Personally, I don’t expect you’d want to spend much on shielding.
Not trying to claim Starcloud has a fully coherent plan, ofc.