I’m not at all sure that software falls well short of biological systems in what it can do with a given number of operations per second. Consider cat image recognition: Google’s system has minuscule computing power compared to the human visual cortex, and performs accordingly (badly).
What I suspect, though, is that the greatest advances in speeding up technological progress would come from better algorithms for well-defined problems, such as making better transistors. That is a domain where even humans make breakthroughs not by verbally running some “I think, therefore I am” philosophy in their heads, but either by throwing science at the wall and seeing what sticks, or by visually imagining the process, trying to imitate a non-intelligent simulator. The same goes for automated software development: so much of the thinking a human does for such tasks is, really, unrelated to the human capacity to see meaning and purpose in life, to symbol grounding, or to anything else of the kind that makes us fearsome, dangerous survival machines. None of that needs to be built into automated programming software.