I think Jake is right that we shouldn’t imagine an unlimited set of levels of learning. However, I do think there are one or two more levels beyond self-learning and cultural transmission. The next level (which could perhaps be described as two levels) is not something evolution has managed in any mammalian species:
1. Take an existing brain that has filled most of its learning capacity and is beginning to plateau in skill gained from experience, and add significantly more capacity.
2. Make significant architectural changes involving substantial rewiring of long-distance connections. For example, rewiring half of my visual cortex to instead serve as part of my mathematical reasoning module.
Both of these are essentially examples of plasticity/editability.
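The first kind of change has a clean analogue in artificial networks. As a toy illustration (a sketch only, using arbitrary weights and a Net2Net-style widening trick, not anything from the discussion above), you can add capacity to an already-trained network without disturbing what it has learned, by duplicating hidden units and splitting their outgoing weights so the widened network computes the same function but has extra trainable capacity:

```python
import numpy as np

# Toy two-layer network: y = W2 @ relu(W1 @ x)
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # hidden=4, input=3
W2 = rng.normal(size=(2, 4))   # output=2

def forward(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Widen the hidden layer from 4 to 6 units by duplicating two
# existing units and halving their outgoing contributions, so the
# widened network computes exactly the same function while gaining
# extra capacity for further training (Net2Net-style widening).
dup = [0, 1]                           # units to duplicate (arbitrary choice)
W1_wide = np.vstack([W1, W1[dup]])     # copy incoming weights
W2_wide = np.hstack([W2, W2[:, dup]])  # copy outgoing weights...
W2_wide[:, dup] *= 0.5                 # ...then split each contribution
W2_wide[:, 4:] *= 0.5                  # between original and duplicate

x = rng.normal(size=3)
assert np.allclose(forward(W1, W2, x), forward(W1_wide, W2_wide, x))
```

The point of the function-preserving trick is that the enlarged system starts from the plateaued system's skill level rather than from scratch, which is exactly what makes "add more capacity" a continuation of learning rather than a restart.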
I expect that if we had the ability to do either one of these to a human (e.g. via brain-computer interface), then we could turn a below-average-IQ human into an impressively skilled mathematician, and an impressively skilled mathematician into the greatest math genius in the history of the human race.
If I am correct about this, then I think it is fair to consider this a fundamentally different level from cultural knowledge transmission.
(Copied from another comment) Nathan points out increasing size and large-scale/connective plasticity. Another level would be full reflectivity: introspection and self-reprogramming. Yet another would be the ability to copy chunks of code and A/B test them as they function in the whole agent. I don’t get why Jacob is so confident that these sorts of things aren’t major, or that there aren’t more of them than we’ve thought of.
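The code-chunk A/B-testing idea can be made concrete with a toy sketch. Everything here is hypothetical and illustrative (the "policies", reward, and episode loop are invented for the example): the agent swaps in a candidate replacement for one of its components, evaluates both variants as they function in the whole agent loop, and keeps whichever scores better:

```python
import random

# Two interchangeable "chunks" of the agent: a baseline policy and a
# candidate self-modification for the same sub-task. Purely illustrative.
def policy_a(obs):
    return obs * 2        # baseline chunk

def policy_b(obs):
    return obs * 2 + 1    # candidate replacement chunk

def run_episode(policy, seed):
    """Run the full agent loop once with the given chunk plugged in."""
    rng = random.Random(seed)
    obs = rng.uniform(0, 1)
    action = policy(obs)
    # Toy reward: actions closer to 2.0 score higher.
    return -abs(action - 2.0)

def ab_test(pol_a, pol_b, n=1000):
    """Evaluate both chunks in situ over n episodes and keep the one
    with the higher mean reward."""
    score_a = sum(run_episode(pol_a, s) for s in range(n)) / n
    score_b = sum(run_episode(pol_b, s) for s in range(n)) / n
    return pol_a if score_a >= score_b else pol_b

best = ab_test(policy_a, policy_b)
```

Biological evolution can only run this experiment across generations of whole organisms; an agent that can copy and swap its own components can run it within a single "lifetime", which is part of why this plausibly counts as a distinct level.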