Could part of the issue with hallucination in LLMs be that they are often trained on Internet conversations where no one is likely to chime in with “I don’t know”?