“In practice replacing digital computers with an alternative computing paradigm is a risky proposition.
Alternative computing architectures, such as parallel digital computers have not tended to be commercially viable, because Moore’s Law has consistently enabled conventional von Neumann architectures to render alternatives unnecessary.
Besides Moore’s Law, digital computing also benefits from mature tools and expertise for optimizing performance at all levels of the system: process technology, fundamental circuits, layout and algorithms.
Many engineers are simultaneously working to improve every aspect of digital technology, while alternative technologies like analog computing do not have the same kind of industry juggernaut pushing them forward.”
--Benjamin Vigoda, “Analog Logic: Continuous-Time Analog Circuits for Statistical Signal Processing” (2003 PhD thesis)
And the very next year, Intel abandoned its plans to make 4 GHz processors, and we’ve been stuck at around 3 GHz ever since.
Since then, parallel computing has indeed had the industry juggernaut behind it.
Yep, and that’s why we all have dual-core or better processors now rather than long ago. Parallel computers of various architectures have been around since at least the ’50s (mainframes had secondary processors for I/O operations, IIRC), but were confined to niches until the frequency wall was hit and the juggernaut had to do something else with the transistors Moore’s law was producing.
(I also read this quote as an indictment of the Lisp machine and other language-optimized processor architectures, and more generally, as a Hansonesque warning against ‘not invented here’ thinking; almost all innovation and good ideas are ‘not invented here’ and those who forget that will be roadkill under the juggernaut.)