define a general intelligence as one that doesn’t perform stupidly in a novel environment
What does “stupidly” mean in this context?
It also seems to me you're setting a very high bar, one that you yourself admit (individual) humans generally can't reach. If they can't, why set it so high? Since we can't reach even human-level intelligence at the moment, there doesn't seem to be much sense in speculating about designing even harder tests for AIs.