I think it’s unlikely that AIs are talking people into active psychosis who wouldn’t otherwise have developed it on their own. AIs can talk people into really dumb positions and stroke their ego over “discovering” some nonsense “breakthrough” in mathematics or philosophy. I don’t think anyone disputes this.
But it seems unlikely that AIs can talk someone into full-blown psychosis who wouldn’t have developed something similar at a later time. Bizarre beliefs aren’t the central manifestation of psychosis; they are simply the most visible symptom. A normal human who is talked into a bizarre belief would look something like Terrence Howard, who is functional but spends some of his spare time trying to prove that 1 × 1 = 2. He is still gainfully employed and socially integrated. He is not psychotic. I wouldn’t be surprised if sycophantic LLMs can talk someone normal into acting like Terrence Howard, at least for a while. But that isn’t psychosis.
My understanding of schizophrenia is that, in someone already predisposed to it, the first emotionally traumatizing or psychedelic event in adulthood causes some sort of mental shift that results in schizophrenia for the rest of their life. That trigger could be a breakup, LSD, marijuana, or a highly sycophantic LLM. But even high doses of any of those wouldn’t cause a lifelong tendency toward psychosis in a normal human. I doubt LLMs are different. Thus, it may be more accurate to say that LLMs, through extreme sycophancy, can act as the trigger for psychosis, rather than that “LLMs cause psychosis”.