Michael Anissimov wrote:

To say that we have to exactly copy a human brain to produce true intelligence, if that is what Knapp and Stross are thinking, is anthropocentric in the extreme. Did we need to copy a bird to produce flight? Did we need to copy a fish to produce a submarine? Did we need to copy a horse to produce a car? No, no, and no. Intelligence is not mystically different.
I think he doesn’t understand what those people are saying. Nobody doubts that you don’t need to imitate human intelligence to get artificial general intelligence; the claim is rather that a useful approximation of AIXI is much harder to achieve than an understanding of human intelligence.
AIXI is as far from real-world human-level general intelligence as the abstract notion of a Turing machine with an infinite tape is from a supercomputer with the computational capacity of the human brain. An abstract notion of intelligence doesn’t get you anywhere in terms of real-world general intelligence, just as showing that in some abstract sense you can simulate every physical process won’t let you upload yourself into the Matrix.
It seems to me that you are systematically underestimating the significance of this material. Solomonoff induction (which AIXI is based on) is of immense theoretical and practical significance.
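To make the disagreement concrete: Solomonoff induction predicts the next bit of a sequence by summing over all programs that reproduce it, weighting each program by 2 to the minus its length. The full sum is incomputable, which is exactly the point of contention above. The following is a deliberately toy sketch (my own illustration, not anyone's actual proposal) that replaces "all programs" with a tiny hypothesis class of repeating bit-patterns, keeping only the simplicity-weighting idea:

```python
import itertools

def toy_solomonoff_predict(history, max_len=8):
    """Toy stand-in for Solomonoff induction.

    Hypotheses are repeating bit-patterns of length <= max_len; each
    pattern consistent with the observed history contributes weight
    2**(-length) to its predicted next bit (shorter = higher prior).
    Real Solomonoff induction sums over *all* programs and is
    incomputable; this only illustrates the weighting scheme.
    """
    weights = {0: 0.0, 1: 0.0}
    for k in range(1, max_len + 1):
        for pattern in itertools.product("01", repeat=k):
            # keep only patterns that reproduce the observed history
            if all(history[i] == pattern[i % k] for i in range(len(history))):
                next_bit = int(pattern[len(history) % k])
                weights[next_bit] += 2.0 ** (-k)  # simplicity prior
    total = weights[0] + weights[1]
    return {bit: w / total for bit, w in weights.items()}

# After seeing "010101", the shortest consistent pattern ("01")
# dominates, so the predictor strongly favors 0 as the next bit.
print(toy_solomonoff_predict("010101"))
```

Even this drastic simplification loops over exponentially many hypotheses in `max_len`; scaling the idea to arbitrary programs is where the "useful approximation of AIXI" problem bites.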