One of the interesting observations in computing is that Moore’s law of processing power has been accompanied by an almost equally steady Moore’s law of energy efficiency. This makes sense: ultimately you have to deal with the waste heat, so if energy consumption per computation (and hence heat production) were not halving roughly every turn of Moore’s law, you would quickly wind up unable to run your faster, hotter new chips at all.
This leads to Ozkural’s projection that increasing (GPU) energy efficiency is the real limit on any widespread, economical use of AI; given past rates of improvement, we will have the hardware capability to run cost-effective neuromorphic AI by 2026, after which the remaining wait is purely a matter of software...
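To get a feel for how quickly a steady halving compounds, here is a toy projection of energy-per-operation under an assumed Moore’s-law-style trend. The doubling period, baseline year, and baseline energy figure below are illustrative assumptions for the sketch, not numbers from Ozkural or the text:

```python
def joules_per_op(year, base_year=2010, base_joules=1e-9, doubling_years=1.6):
    """Energy per operation, assuming efficiency doubles every
    `doubling_years` years from an assumed baseline.

    All parameter defaults are illustrative assumptions."""
    doublings = (year - base_year) / doubling_years
    return base_joules / (2 ** doublings)

# Cumulative efficiency gain from the assumed 2010 baseline to 2026:
improvement = joules_per_op(2010) / joules_per_op(2026)
print(f"~{improvement:,.0f}x more energy-efficient by 2026")
```

Under these assumed parameters, 16 years yields ten doublings, i.e. roughly a thousand-fold improvement in energy per operation, which is the kind of compounding that makes a hard hardware threshold (like “cost-effective neuromorphic AI”) plausible to date at all.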