I agree that the AIs are pretty bad at handling conceptually confusing stuff. I basically think of them as incredibly knowledgeable, not that smart, and possessed of huge amounts of programming intuition (mostly due to their knowledge and their having read enormous quantities of code).
My guess is that for any reasonable operationalization of “raw intelligence”, they’re getting smarter?