My hope is that AI girlfriends will also act as life coaches. This is well within the capabilities of current LLMs. You’d have to make the coaching unobtrusive enough, but it seems like it would be a selling point.
This could be a huge societal plus. Among other advantages of having a best friend who’s semi-expert in most fields, an AI partner could coach someone in how to find a healthy human relationship.
I have seen this argument and am deeply sceptical that it will happen, for the same reason that mobile F2P games rarely turn into relaxing, educational, self-contained experiences. The incentives are all aligned towards making AI girlfriends into digital crack and hooking “whales” to bleed dry.
When you put it that way, it’s pretty compelling.
Now, their being crack isn’t in conflict with their giving good advice in other areas of life. If anything, good advice is one more way to make them crack.
But anything that might lead the user to quit is in conflict with that business model.
“The healthy AI partner” subscription is something parents would buy for their kids. But their kids would only use it if it were also genuinely good. So I’m not sure there’s a business model there.
And, come to think of it, few parents are going to pay money for their kids to have pretend sex with AIs. Because ew.
Oh well. Hopefully this doesn’t play a large role in the main question of whether we get aligned or misaligned AGI.