As another suggestive example, kids who grow up exposed to grammatical language will learn that language, but kids who grow up without exposure to grammatical language will simply create a new grammatical language from scratch, as in Nicaraguan Sign Language and creoles. (Try training an LLM from random initialization, with zero tokens of grammatical language anywhere in its training data or prompt. It’s not gonna spontaneously emit grammatical language!) I think that’s a good illustration of why imitation learning is just entirely the wrong way to think about what’s going on with brain algorithms and brain-like AGI.
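As a rough illustration of that parenthetical, here’s a minimal sketch (assuming the Hugging Face `transformers` library and PyTorch are installed) of sampling from a GPT-2-sized model whose weights are random, i.e. a model that has seen zero tokens of training data:

```python
# Sketch: sample text from a randomly initialized (untrained) GPT-2-sized model.
# The tokenizer is only used to map between text and token IDs; all of a trained
# LLM's "knowledge" of grammar lives in weights learned by imitating grammatical
# training data, which this model has never seen.
from transformers import GPT2Config, GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # vocabulary only
model = GPT2LMHeadModel(GPT2Config())               # random weights, no training
model.eval()

inputs = tokenizer("Once upon a time", return_tensors="pt")
output = model.generate(
    **inputs,
    do_sample=True,
    max_new_tokens=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0]))
# Prints an arbitrary jumble of tokens -- nothing remotely grammatical, and no
# amount of sampling will change that, because grammar isn't "invented" here.
```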
Isn’t this because humans have a hard-coded “language instinct”?
It sounds like you’re suggesting that inventing grammar is the convergent result of a general competency?
There are some caveats, but more-or-less, yeah. For example, the language-processing parts of the cortex look pretty much the same as every other part of the neocortex. And some people talk about how language is special because it has “recursion”, but in fact we handle “recursion” perfectly well in vision too (e.g. we can recognize a picture inside a picture), in planning (e.g. we can make a plan that incorporates a sub-plan), etc.