Kurzweil (and gwern, in a cousin comment) both think that “effort will be allocated efficiently over time”, and for Kurzweil this explained much more than just Moore’s Law.
Ray’s charts from “the olden days” (the nineties and aughts and so on) were normalized around what “$1,000 (inflation-adjusted) spent on computing” could buy… and this let him put vacuum tubes and even steam-powered, gear-based computers on a single chart… and it still worked.
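To make the normalization concrete, here is a minimal sketch of that kind of chart metric: calculations per second per $1,000 of inflation-adjusted spending. The device entries and deflator values below are made-up placeholders for illustration, not Ray’s actual data.

```python
# Kurzweil-style normalization sketch: calc/sec per $1,000 in constant dollars.
# All numbers below are hypothetical, not taken from Kurzweil's charts.

# (year, calculations_per_second, nominal_price_usd) -- illustrative only
devices = [
    (1940, 1e0, 50_000),    # hypothetical electromechanical machine
    (1955, 1e4, 500_000),   # hypothetical vacuum-tube computer
    (2000, 1e9, 1_000),     # hypothetical desktop PC
]

# Rough multipliers to convert each year's nominal dollars into year-2000
# dollars (assumed deflators, purely illustrative).
deflator_to_2000 = {1940: 12.0, 1955: 6.5, 2000: 1.0}

for year, cps, nominal_price in devices:
    real_price = nominal_price * deflator_to_2000[year]   # constant dollars
    cps_per_1000 = cps / (real_price / 1000.0)            # normalize to $1,000
    print(f"{year}: {cps_per_1000:.3g} calc/sec per $1,000 (2000 dollars)")
```

Because the y-axis is always “what a fixed real-dollar budget buys”, wildly different technologies (gears, tubes, transistors) land on one comparable curve.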
The 2020s have basically always been very likely to be crazy. Based on my familiarity with old ML/AI systems and standards, the bar that “AGI” denoted a decade ago has already been cleared. Claude is already smarter than most humans, but (from the perspective of what smart, numerate, and reasonable people predicted in 2009) he is (arguably) over budget and behind schedule.