Anonymous question (ask here):
Given all the computation it would be carrying out, wouldn’t an AGI be extremely resource-intensive? Something relatively simple like bitcoin mining (simple when compared to the sort of intellectual/engineering feats that AGIs are supposed to be capable of) famously uses up more energy than some industrialized nations.
Short answer: Yep, probably.
Medium answer: If AGI has components that look like our most capable modern deep learning models (which I think is quite likely if it arrives in the next decade or two), it will probably be very resource-intensive to run, and orders of magnitude more expensive to train. This is relevant because it impacts who has the resources to develop AGI (large companies and governments; likely not individual actors), secrecy (it’s more difficult to secretly acquire a massive amount of compute than it is to secretly boot up an AGI on your laptop; this may even enable monitoring and regulation), and development speed (if iterations are slower and more expensive, it slows down development).
If you’re interested in further discussion of possible compute costs for AGI (and how this affects timelines), I recommend reading about bio anchors.
(I’m not sure, but why would this be important? Sorry for the silly answer; feel free to reply in the anonymous form again.)
I think a good baseline for comparison would be:
Training large ML models (expensive)
Running trained ML models (much cheaper)
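To put rough numbers on the training-vs-running gap, here is a back-of-the-envelope sketch using the commonly cited approximations that training a dense transformer costs about 6·N·D FLOPs (N parameters, D training tokens) and a forward pass costs about 2·N FLOPs per token. The model size and token count below are made-up placeholders, not estimates for any real system:

```python
# Back-of-the-envelope comparison of training vs. inference compute
# for a dense transformer, using the standard approximations:
#   training  ~ 6 * N * D FLOPs  (N = parameters, D = training tokens)
#   inference ~ 2 * N     FLOPs per generated token

N = 100e9  # 100B parameters (illustrative)
D = 2e12   # 2T training tokens (illustrative)

training_flops = 6 * N * D
flops_per_token = 2 * N

# How many tokens of inference cost as much as one training run:
tokens_equivalent = training_flops / flops_per_token

print(f"Training run:  {training_flops:.1e} FLOPs")
print(f"Inference:     {flops_per_token:.1e} FLOPs per token")
print(f"One training run ~= generating {tokens_equivalent:.1e} tokens")
```

Under these approximations one training run costs as much compute as generating 3·D tokens, which is why training is the expensive step and running the trained model is comparatively cheap per use.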
I think comparing to blockchain is wrong, because:
it was deliberately designed to be resource-intensive (this adds to the security of proof-of-work blockchains)
there is a financial incentive to spend a specific (very high) amount of resources on blockchain mining (what you get is literally a currency with a certain market value, so it’s worthwhile to spend any amount of money below that value on the mining process)
Neither of these is true for ML/AI, where your incentive is more something like “do useful things”
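The mining incentive described above can be sketched as a toy model: industry-wide mining spend gets pulled up toward the value of the block rewards, because entry is profitable whenever current spend is below that value. All numbers here are illustrative, not real mining figures:

```python
# Toy model of the mining incentive: total mining spend converges toward
# the value of the rewards, because new miners enter whenever spending
# less than the reward value is still profitable. Numbers are made up.

reward_value = 1_000_000.0  # USD value of mining rewards per day (illustrative)
total_spend = 100_000.0     # current industry-wide mining spend per day

# Each step, new entrants capture a fraction of the remaining profit margin.
for _ in range(50):
    gap = reward_value - total_spend
    total_spend += 0.2 * gap

print(round(total_spend))  # close to reward_value
```

Nothing pins ML spending to a fixed external value in this way: there is no mechanism that makes it profitable to burn resources right up to some target, which is why the bitcoin energy comparison doesn’t transfer.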