Daniel Kokotajlo originally pointed me to this article. Thank you!
There is no question that human brains have tons of instincts built in. But there is a hard limit on how much information a single species’ instincts can contain, because those instincts must be encoded in the genome. It is implausible that human beings’ cognitive instincts contain significantly more information than the human genome (roughly 750 megabytes). I expect our instincts contain much less.
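For reference, the 750-megabyte figure falls out of simple arithmetic (a sketch; the exact base-pair count varies slightly by source):

```python
# Information capacity of the human genome, back-of-the-envelope.
# Assumes ~3.1e9 base pairs and 2 bits per base (4 possible nucleotides).
base_pairs = 3.1e9
bits = base_pairs * 2            # each base carries log2(4) = 2 bits
megabytes = bits / 8 / 1e6       # bits -> bytes -> megabytes
print(f"~{megabytes:.0f} MB")    # ~775 MB, consistent with the figure above
```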
Human brains definitely have special architectures too, such as the hippocampus. The critical question is how important these architectures are: are they essential to general intelligence, or are they just speed hacks? If they are just speed hacks, then we can outrace them by building bigger computers or writing more efficient algorithms.
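To illustrate what “just a speed hack” can mean in the ANN setting: a convolutional layer is a fully connected layer with a constrained weight matrix, so the specialization buys efficiency rather than expressive power. A minimal numpy sketch (my illustration, not anything from the article):

```python
import numpy as np

# A 1-D convolution is a dense layer whose weight matrix is sparse and
# weight-shared -- the "special architecture" saves parameters and compute
# but adds no expressive power a large enough dense layer lacks.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)                    # input signal, length 8
k = rng.standard_normal(3)                    # kernel, width 3

# Direct "valid" convolution (cross-correlation, as in neural nets).
conv_out = np.array([x[i:i + 3] @ k for i in range(6)])

# The same computation as a dense layer with a Toeplitz weight matrix.
W = np.zeros((6, 8))
for i in range(6):
    W[i, i:i + 3] = k                         # shared weights on each row
dense_out = W @ x

assert np.allclose(conv_out, dense_out)       # identical outputs
```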
There is no doubt that humans transmit more cultural knowledge than other animals, and this has to do with language. (More specifically, I think the biology underpinning human language hit a critical point around 50,000 years ago.) Complex grammar is not present in any non-human animal. Wernicke’s area, which is involved in language comprehension, could be one such special architecture.
How important are the above human advantages? I believe that taking a popular ANN architecture and merely scaling it up will not enable a neural network to compete with humans at StarCraft given equal quantities of training data. If, in addition, the ANN is not allowed to use transfer learning, then I am willing to publicly bet money on this prediction. (The ANN must be restricted to a human rate of actions-per-second. The ANN does not get to play via an API or a similar hand-coded preprocessor. If the ANN watches videos of other players, that counts towards its training data.)
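For concreteness, here is one way the actions-per-second restriction could be enforced. This is only a sketch with an illustrative cap; no particular mechanism or number is specified above:

```python
import time

# Hypothetical enforcement of the "human rate of actions-per-second"
# condition: a simple token-bucket limiter wrapped around the agent's
# actions. The 5 actions/sec cap is an illustrative number.
class ActionRateLimiter:
    def __init__(self, actions_per_second: float = 5.0):
        self.rate = actions_per_second
        self.tokens = actions_per_second       # start with a full bucket
        self.last = time.monotonic()

    def try_act(self) -> bool:
        """Return True if an action is allowed right now."""
        now = time.monotonic()
        elapsed = now - self.last
        self.tokens = min(self.rate, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                           # over the APM cap; suppress
```

A token bucket allows short bursts while keeping the long-run rate at the cap, which roughly matches how human actions-per-minute behave in practice.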
If the ANN can’t use transfer learning, that’s pretty unfair, since the human can. (It’s not like a baby can play StarCraft straight out of the womb; humans can learn StarCraft, but only after years of pre-training on diverse data in diverse environments.)
Good point. Transfer learning is allowed, but it still counts towards the total training data, where “training data” now means everything a human can process over a lifetime.
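For scale, a rough back-of-the-envelope on the human side of that budget (all numbers here are loose illustrative assumptions, not from the discussion):

```python
# Rough size of a human's lifetime "training data" budget in wall-clock
# terms. All figures are loose assumptions for illustration only.
years_before_starcraft = 18        # pre-training before learning the game
waking_hours_per_day = 16
seconds = years_before_starcraft * 365 * waking_hours_per_day * 3600
print(f"~{seconds:.2e} waking seconds of experience")   # ~3.8e8 seconds
```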
Once you add this condition, are current state-of-the-art StarCraft-learning ANNs still getting more training data than humans?
It is implausible that human beings’ cognitive instincts contain significantly more information than the human genome (roughly 750 megabytes). I expect our instincts contain much less.
Our instincts contain pointers to learning from other humans, and that learning contains lots of cognitive information. The pointer is small, but that doesn’t mean the resulting organism is algorithmically simple.
That seems plausible, but AIs can have pointers to learning from other humans too. E.g., GPT-3 read the Internet; if we were building some more complicated system, it could evolve pointers analogous to the human ones, I think.