One way the difference between LLM psychosis and misbelief shows up: a substantial number of people are under the impression that LLMs are authoritative sources of truth. They don’t know anything about ML, they know ChatGPT is a really big deal, and they haven’t heard about hallucinations. Under those circumstances, no distorted thinking is needed for them to believe the LLM is correct when it tells them they have an important breakthrough.