I’ve been relatively skeptical of the whole 4o-psychosis thing (specifically, about its effect size), but the public outcry about 4o’s shutdown, and stuff like this, are tipping me over to “this is an actual serious problem”.
Like, the psychosis cases are just the tip of the iceberg. There are vast volumes of social dark matter: people who’ve become dependent on LLMs[1] yet know to hide it, and who haven’t yet become so dysfunctional that they can’t. And while the effects in any individual case may be relatively minor, this has the potential to screw society up even worse than social media did, if LLMs slightly lift the craziness level of the median person and that has compounding effects. (Especially in worlds where LLM use proliferates and they get integrated into apps everyone uses, with Meta et al. optimizing those integrated LLMs for precisely this sort of dependency-inducing behavior.)
I mean, it probably won’t actually matter, because the world as we know it would end (one way or another) before this has significant effects. But man, the long-timeline LLM-plateau worlds are potentially fucked as well.
[1] In a counterfactual way where they otherwise would’ve been fine, or at least meaningfully finer.