It may seem to have been a golden age of promise now lost, but I was there, and that isn’t how it seems to me.
As examples of computer science in 1985, the linked blog post cites the Lisp machine and ALICE. The Lisp machine was built. It was sold. There are no Lisp machines now, except maybe in museums or languishing as mementos. ALICE (not notable enough to get a Wikipedia article) never went beyond a hardware demo. (I knew Mike Reeve and John Darlington back then, and knew about ALICE, although I wasn’t involved with it. One of my current colleagues was, and still has an old ALICE circuit board in his office. I was involved with another alternative architecture, of which, at this remove, the less said the better.)
What killed them? Moore’s Law, and people observed this even at the time. There was no point in designing special-purpose hardware for better performance, because general-purpose hardware would double in speed before long and outperform you before you could ever get into production. Turning up the clock made everything faster, while specialised hardware only made a few things faster.
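The arithmetic behind that argument is worth making explicit. Here is a minimal sketch, with the specific numbers (a 10× one-off speedup, a 3-year design cycle, an 18-month doubling period) chosen purely for illustration, not taken from any real project:

```python
# Illustrative arithmetic (all figures are assumptions, not real data):
# suppose a special-purpose design promises a 10x speedup over today's
# general-purpose chip, but takes 3 years to reach production, while
# general-purpose clocks double every 18 months at Moore's-Law-era pace.

special_speedup = 10.0   # hypothetical one-off gain from custom hardware
design_years = 3.0       # hypothetical time from design to production
doubling_months = 18.0   # classic doubling period

# How much faster has general-purpose hardware become by the time you ship?
general_growth = 2 ** (design_years * 12 / doubling_months)

print(f"general-purpose gain while you design: {general_growth:.0f}x")
print(f"remaining advantage at launch: {special_speedup / general_growth:.1f}x")
```

On these numbers the custom machine ships with only a 2.5× edge, and roughly two more years of doublings erase even that, which is the squeeze the Lisp machines and their kin ran into.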
Processors stopped getting faster in 2004 (when Intel bottled out of making 4GHz CPUs). The result? Special-purpose hardware primarily driven not by academic research but by engineers trying to make stuff that did more within that limit: GPUs for games and server farms for the web. Another damp squib of the 1980s, the Transputer, can be seen as ancestral to those developments, but I suspect that if the Transputer had never been invented, the development of GPUs would be unaffected.
When it appears, as the blog post says, “that all you must do to turn a field upside-down is to dig out a few decades-old papers and implement the contents”, well, maybe a geek encountering the past is like a physicist encountering a new subject. OTOH, he is actually trying to do something, so props to him, and I hope he succeeds at what could not be done back then.