There’s no such thing as “a domain where LLMs are particularly likely to hallucinate”. In every domain there is some obscure, jagged boundary, not far from the standard questions people normally ask, where LLMs will hallucinate, and usually plausibly enough to fool a non-expert.