By the way, my memory fails me: what exactly does Joy say about AI risk? What is his angle? If I recall correctly, he cites the dangers of robots, not of superintelligence.
E.g. the word “superintelligence” (or “superintelligent”) appears only once in Bill Joy’s famous essay “Why the future doesn’t need us”, and then only in a Moravec quote. “Robot” (or “robotics”) appears 52 times.
He says—of the “robots”:
“If they are smarter than us, stronger than us, evolve quicker than us, they are likely to out-evolve us—in the same way that we have taken over the planet and out-evolved most of the other creatures”—source.
Still, that doesn’t tell me why Gates said “superintelligent computers” rather than “highly evolved robots”.
Give a superintelligence some actuators and it becomes a robot. A superintelligence without actuators is not much use to anyone.
The point is that Gates’s turn of phrase is informative about the provenance of his ideas.
It might so inform—but Gates has a brain between his ears and mouth—and these concepts are likely to be old and familiar ones for him—so internal concept processing also seems fairly likely.
An oracle built with solid-state hard drives and no cooling fans would not be of use to anyone?