The assumed superintelligence can take what it wants, and if people could “produce more with $77 of sunlight than a superintelligence can produce with $77 of sunlight”, then it could probably force people to produce that surplus for it.
I was with you until this sentence. This does not follow.
Let’s suppose “$77 worth of sunlight” has a consistent, agreed-upon meaning. Maybe “Enough sunlight to generate $77 worth of electricity (at the current production cost of $0.04/kWh) with current human-made solar panels over their 25-year lifespan.” This is a little less than what falls on an average plot of land on the order of 20 cm × 20 cm. The superintelligence could hire humans to build the solar panels, or use the electricity to run human-made equipment, or farm the plot to grow about 40 g of corn per year.
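For concreteness, here is a quick back-of-envelope check of those figures (a sketch; the insolation, panel-efficiency, and corn-yield numbers are my assumptions, not part of the definition above):

```python
# Rough check of the "$77 of sunlight" plot size. Only the $77, the
# $0.04/kWh, and the 25-year lifespan come from the definition above;
# everything else is an assumed round number.

dollars = 77.0
price_per_kwh = 0.04                        # $/kWh production cost
lifetime_hours = 25 * 365.25 * 24           # ~219,150 h over 25 years

energy_kwh = dollars / price_per_kwh        # 1,925 kWh of electricity
avg_power_w = energy_kwh * 1000 / lifetime_hours   # ~8.8 W continuous

# Assume ~1000 W/m^2 of full sun converted at ~20% panel efficiency,
# ignoring night and weather: ~200 W of electricity per m^2 of panel.
area_m2 = avg_power_w / (1000 * 0.20)       # ~0.044 m^2
side_cm = 100 * area_m2 ** 0.5              # ~21 cm per side

# Corn at a typical modern yield of ~1 kg/m^2/yr gives ~44 g/yr here.
corn_g_per_yr = area_m2 * 1.0 * 1000

print(f"{energy_kwh:.0f} kWh -> {avg_power_w:.1f} W avg -> "
      f"~{side_cm:.0f} cm x {side_cm:.0f} cm plot, ~{corn_g_per_yr:.0f} g corn/yr")
```

Folding in night, weather, and real capacity factors would make the plot a few times larger, but the order of magnitude holds.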
What can a superintelligence do with that sunlight? Well, it can develop highly optimized 20-junction solar panels, built in advanced robotic facilities, that generate 3x as much electricity. Maybe it has space-based manufacturing, so it can use space-based solar and get 10-15x more electricity. It can use the electricity to run superintelligently designed equipment that is more efficient and produces higher-quality output than anything human minds and hands can invent, build, and operate. These options include things like building indoor robot-operated farms that optimally distribute light/water/heat/nutrients to generate >10x more crops per unit area (>40x more with the aforementioned space-based production facilities), or directly synthesizing specific desired molecules.
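Putting rough numbers on that gap (a sketch using the speculative multipliers above; the “cost to match” framing is my simplification and ignores the quality difference entirely):

```python
# If the superintelligence gets N times the output per unit of sunlight,
# then matching its output with current human methods takes roughly N times
# the sunlight -- about N * $77. Multipliers are the speculative ones above.

baseline_dollars = 77.0

multipliers = {
    "optimized 20-junction panels": 3,
    "space-based solar, low end":   10,
    "space-based solar, high end":  15,
    "indoor robot-operated farms":  10,  # ">10x" crops per unit area
    "space-based crop production":  40,  # ">40x"
}

for name, n in multipliers.items():
    print(f"{name:30s} human cost to match: ~${baseline_dollars * n:,.0f}")
```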
In other words: for humans to produce what a superintelligence could produce from $77 of sunlight would cost the superintelligence many times more than $77, and the output quality would still be much lower.
Right; my point was just that the hypothetical superintelligence does not need to trade with humans if it can force them, so trade-related arguments are not relevant. However, it is of course likely that such a superintelligence would neither want to trade nor care enough about what humans produce to force them to do anything.
Ok, then I agree. As written, it read to me like you were closing by suggesting the AI would want to go for the “conqueror takes both” option instead of the “give the natives smallpox and drive them from their ancestral homes while committing genocide” option.