assume the AGI will have the capacity to kill us at very low cost/risk
This assumption comes at a high cost in probability mass. The difficulty of “killing off humanity” type tasks will increase exponentially as AI leads to AGI and AGI to super-AGI; it's a moving target.
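A toy sketch of the probability-mass point, with entirely made-up numbers and an independence assumption, just to show the mechanics of how a conjunctive argument bleeds mass with each added premise:

```python
# Toy illustration (made-up probabilities, independence assumed):
# each extra assumption in a conjunctive argument multiplies the
# remaining probability mass down.
premises = {
    "AGI arrives": 0.5,
    "AGI ends up with goals hostile to humans": 0.3,
    "killing humanity is very low cost/risk for it": 0.1,  # the contested premise
}

p = 1.0
for claim, prob in premises.items():
    p *= prob
    print(f"after assuming {claim!r}: remaining mass = {p:.3f}")
# With these illustrative numbers, the contested premise alone
# cuts the argument's probability mass by a factor of ten.
```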
humans make very inefficient use of resources (including what we need to stay alive)
Largely irrelevant: humans use an infinitesimal fraction of solar resources. Moreover, bacteria, insects, and rats make very inefficient use of our resources as well; why haven’t we killed them off?
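Rough back-of-the-envelope arithmetic behind “infinitesimal fraction” (the physical constants are standard; the ~19 TW figure for human primary energy use is an order-of-magnitude public estimate):

```python
import math

# Back-of-the-envelope check of the "infinitesimal fraction" claim.
SOLAR_LUMINOSITY_W = 3.8e26    # total power output of the Sun
SOLAR_CONSTANT_W_M2 = 1361.0   # solar irradiance at Earth's orbit
EARTH_RADIUS_M = 6.371e6
HUMAN_POWER_USE_W = 1.9e13     # ~19 TW of human primary energy use

# Sunlight intercepted by Earth's cross-sectional disc.
earth_intercepted_w = SOLAR_CONSTANT_W_M2 * math.pi * EARTH_RADIUS_M**2

print(f"share of sunlight reaching Earth: {HUMAN_POWER_USE_W / earth_intercepted_w:.1e}")
# ~1e-4: humans use about 0.01% of the sunlight that hits Earth.
print(f"share of total solar output:      {HUMAN_POWER_USE_W / SOLAR_LUMINOSITY_W:.1e}")
# ~5e-14: a vanishingly small fraction of the Sun's full output.
```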
most AGI goals are improved by more resources
The bronze age did not end for lack of bronze, nor the coal age for lack of coal. Evolution appears to move forward by using fewer resources rather than more.
most AGI goals are not human friendly
Who cares? Most AGI goals will never be realized.
Hence most AGIs will make better use of resources by controlling them than by trading with humans.
True; the question is: what resources?
Hence most AGIs will kill us by default.
Most random at-home brain surgeries will kill us by default as well.