LLMs are more accurately described as artificial culture than as artificial intelligence. They've achieved what they have by replicating the secret of our success: accumulating culture (at least in the form of text-based cultural artifacts) far more extensively than any human ever could. But cultural knowledge isn't the same thing as intelligence, hence LLMs' continued difficulties with sequential reasoning and planning.