One argument is that energy costs 0.1 cents per kWh in space versus 5 cents on Earth. For now, launch costs dominate this, but in the future that balance might change.
I wish that article had included any justification for that argument; I couldn't find any elsewhere by Chris Stott either. Space energy today is certainly far more expensive than energy on Earth, probably 1000x.
Even as launch costs approach zero, hardware that has to operate in space is more expensive than hardware that doesn't: vastly more difficult maintenance and cooling, radiation hardening, latency… I don't see how even 7x cheaper energy per unit area (current ground solar averages ~200 W/m^2, the maximum in space is 1360 W/m^2) can make up for those things. (edit: this accounts for weather + night + atmosphere effects, see the comment below)
In addition to getting more energy from a given area, you also get that energy essentially 100% of the time (no issues with night or clouds). But yeah, I agree, and I don't see how you get to 50x cheaper even if transport to space (and assembly/maintenance in space) were free.
I agree that even with free launch and no maintenance costs you still don't get the 50x implied by 0.1 cents vs 5 cents per kWh. But it's closer than it looks.
On Earth, to get reliable self-contained solar power we need batteries that cost a lot more than the solar panels. A steady 1 kW load needs on the order of 15 kW of peak-rated solar panels plus around 50 kWh of battery capacity. Even that doesn't get you 99% uptime, but it's enough for many purposes, and it is probably adequate when connected to a continent-spanning grid with other power sources.
The same load in orbit would need about 1.5 kW of peak-rated panels and less than 1 kWh of battery capacity, with uptime dependent only on the reliability of the equipment (rough comparison sketched below). The equipment does need to be designed for space, but it doesn't need to be sturdy against wind, rain, and hailstones. It would have increased cooling costs, but transporting heat (e.g. via a coolant loop) into a radiator kept edge-on to the Sun is highly effective (on the order of 1000 W/m^2 for a radiator averaging 35 C).
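A quick back-of-envelope in Python with the sizing numbers I'm assuming above (not measured values); the radiator figure is just a two-sided Stefan-Boltzmann estimate that ignores any absorbed background heat:

    # Ground vs. orbit sizing for a steady 1 kW load (assumed figures from above)
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

    ground_panels_kw_peak = 15.0   # heavy oversizing to cover night + weather
    ground_battery_kwh = 50.0      # roughly two days of storage

    orbit_panels_kw_peak = 1.5     # small margin for losses and pointing
    orbit_battery_kwh = 1.0        # brief ride-through only

    print(f"panel ratio (ground/orbit):   {ground_panels_kw_peak / orbit_panels_kw_peak:.0f}x")
    print(f"battery ratio (ground/orbit): {ground_battery_kwh / orbit_battery_kwh:.0f}x")

    # Radiator check: a plate edge-on to the Sun radiates from both faces.
    # At an average 35 C (308 K):
    T = 308.0                      # K
    rejected = 2 * SIGMA * T**4    # ~1020 W per m^2 of radiator area
    print(f"heat rejected at 35 C: {rejected:.0f} W/m^2")

So panel-for-panel the orbital system is roughly 10x smaller and the battery roughly 50x smaller, which is why I say it's closer than it looks.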
This is what I was talking about; I should have been clearer. Year-average daily ground irradiance in the US seems to be about 4-5 kWh/m^2 (this includes night + weather + atmosphere effects, and does not include panel efficiency), which works out to ~200 W/m^2 averaged over the day. In space (assuming sun-synchronous or non-LEO orbits, both more expensive to reach than LEO) you get 1361 W/m^2 continuously. So about 7x. In the Sahara Desert it's ~300 W/m^2, so against the best ground sites the space advantage is only 4-5x.
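For what it's worth, here's the arithmetic behind those ratios in Python; the insolation figures are the rough averages quoted above, not measurements:

    SOLAR_CONSTANT = 1361.0                    # W/m^2 above the atmosphere

    us_daily_kwh_per_m2 = 4.8                  # typical US year-average insolation
    us_avg = us_daily_kwh_per_m2 * 1000 / 24   # -> ~200 W/m^2 averaged over 24 h
    sahara_avg = 300.0                         # rough best-case ground average

    print(f"US average:     {us_avg:.0f} W/m^2, space advantage ~{SOLAR_CONSTANT / us_avg:.1f}x")
    print(f"Sahara average: {sahara_avg:.0f} W/m^2, space advantage ~{SOLAR_CONSTANT / sahara_avg:.1f}x")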