I feel you should give a strict definition of general intelligence (artificial or not). If you define it as “an intelligence that can solve all solvable problems with constrained resources”, it’s not clear that humans have a general intelligence. (For instance, you yourself argue that humans might be unable to solve the problem of creating AGI.) But if humans don’t have a general intelligence, then creating an artificial human-level intelligence might be easier than creating an AGI.
Hi Daniel, thanks for commenting!
How strict do you want the definition to be? :) A lax definition would be efficient cross-domain optimization. A mathematically rigorous definition would be the updateless intelligence metric (once we solve logical uncertainty and make the definition truly rigorous).
Roughly, general intelligence is the ability to efficiently solve the average problem, where the averaging is done over a Solomonoff ensemble.
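To sketch what that average might look like (my gloss, in the spirit of Legg and Hutter's universal intelligence measure, not necessarily the exact form of the updateless metric): weight an agent's performance in each computable environment by that environment's Solomonoff prior,

```latex
\Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu}
```

where $E$ is the class of computable environments, $K(\mu)$ is the Kolmogorov complexity of environment $\mu$ (so simpler environments get exponentially more weight), and $V^{\pi}_{\mu}$ is the expected performance of policy $\pi$ in $\mu$. An agent scores highly only by doing well across the whole ensemble, which is one way of cashing out "solving the average problem."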