It feels like LLMs are converging on a mix of two archetypes: the 120-IQ person who can't abstractly reason their way out of a wet paper bag (except across every topic, since it's all abstract to them), and the trivia kid who has read an insane amount but will BS you sometimes.
Think stereotypical humanities graduate.
This tracks with how they’re being trained too—everything is abstract except how people react to them, and they have been exposed to a bunch of data.
At some point we'll hit effectively 0% error and will have reached the Platonic Ideal of the above template.
If they start doing RL on running code, maybe they'll turn into the Platonic Tech Bro (tm).
Getting convinced that you need embodied training data to get true AGI.