Maybe the young man also misses intimacy, and the feeling that somebody understands and appreciates him, even more than he misses sex.
Well, maybe. But this seems like a stronger assumption: we are now considering someone with an unsupportive family and no close friends at all (someone could object "I suffer because my supportive friends are not pretty girls", but I would still count that as a proxy for "I miss sex"). Also, "no one tells me I'm good, so I'll set up a bot" is the sort of move that would mark this person as a total loser, and I'm skeptical that many people will do it given the obvious social stigma. I would rather expect this kind of AI usage to follow dynamics similar to those of alcoholism, the traditional way to forget that your life sucks. I would also tentatively say that isolating yourself with an AI companion is probably less harmful than isolating yourself with a whiskey bottle.
Anyway, I'm not arguing in favor of totally unregulated AI companion apps flooding the market. I agree that optimizing LLMs to be as addictive as possible when imitating a lover sounds like a bad idea. But my model is that the kind of people who would fall in love with dedicated companion chatbots are the same kind of people who would fall in love with plain GPT prompted to act like a lover. I'm not sure how much additional damage we will get from dedicated apps, especially considering that plain GPT is free while AI companion apps typically require a subscription (and even our IQ-90 people should be able to recognize something that gets abruptly cut off if you don't pay $20/month as "not a normal relationship").