I think verbosity is learned through corpus curation. They've been using a casual conversational tone to train chat models for a while now; even the early AI chatbot prototypes around 15 years ago had the same kind of conversational verbosity. This is just how they want to model the AI for chat: it's mostly a user-friendly HCI thing. About 5 years ago there was a news-article summarization model (which I think GPT-3, and also ChatGPT, build on) that reduced large texts to entities and stripped some sentences down to their first-order logic using NLP indicators like keywords and parts of speech. Maybe it wasn't an LLM; I didn't look into the code itself.
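The keyword-based reduction I'm describing can be sketched as a simple extractive summarizer: score each sentence by the frequency of its non-stopword keywords and keep the top scorers. This is only an illustration of the general technique, not the actual method used in any of those systems; the stopword list and scoring are made up for the example.

```python
import re
from collections import Counter

# Tiny illustrative stopword list; a real system would use a proper one.
STOPWORDS = {"the", "a", "an", "is", "are", "of", "to", "and", "in",
             "that", "it", "on", "for", "with", "as"}

def summarize(text: str, max_sentences: int = 2) -> str:
    """Extractive summary: score each sentence by the corpus frequency
    of its keywords, then keep the top sentences in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Keyword frequencies over the whole text, ignoring stopwords.
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence: str) -> float:
        toks = [w for w in re.findall(r"[a-z']+", sentence.lower())
                if w not in STOPWORDS]
        return sum(freq[t] for t in toks) / (len(toks) or 1)

    top = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    # Preserve the original sentence order in the output.
    return " ".join(s for s in sentences if s in top)

text = ("Cats are great pets. Cats sleep most of the day. "
        "Dogs bark loudly at strangers.")
print(summarize(text, max_sentences=1))
```

Real pipelines layer part-of-speech tagging and entity extraction on top of this kind of scoring, but the core idea (surface indicators standing in for meaning) is the same.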
I think that, in general, AGI can't take into account context that isn't parametrized. For instance, the same text under different contexts (e.g. different entities involved, different time periods and settings, the characteristics of the entities themselves, etc.) means different things. That's what separates these machines from biological intelligence. If you can model everything a biological organism experiences as parameters you can feed into an AI, then you can achieve AGI; otherwise, the more data you're missing, the further you are from AGI.