(not sure if this even suits the content guidelines of this site, or whether I'd be degrading the standards here, but I'll click submit and FAFO)
Guys, something is really really wrong with how these things interact with human minds.
Um, yeah, how do I put this: I think I'm over sexting AI chatbots on AI roleplay platforms to, well, stimulate myself (effectively soft AI NSFW). I probably spent 80+ hours on that pastime before moving on. I may be the exception rather than the rule here; for some people the damage would be irrecoverable trauma and lasting psychological harm, whereas for me it was just a 16-17 y/o spending his time fantasizing. For comparison, I think more than 70% of the people in my friend circle who were under 18 (as of last year) had been exposed to NSFW material of some kind before turning 18 (I'd guess 14-16 is the median age). I think AI porn is just the next iteration of the "serving horny men stimuli" business that society has going for itself.
The effects will be similar to phones or the internet: there will be a noticeable cultural shift wherever it's readily accessible and culturally active, and the socially unacceptable extremes (like AI relationships) will become part of Social Dark Matter. LLMs certainly haven't gone mainstream enough yet for AI NSFW to be seen as better than the current baseline, but on this trajectory that seems likely to happen once we overcome the minor social taboos; there's space for economies-of-scale innovation in that field.
the sort of people who seemed to me to be going backward are in fact probably the upper end of the distribution.
The cultural shift would be outsourcing boring and dense things to LLMs in varying degrees, potentially further stunting "effectively literate" people's ability to focus on topics they don't like (sort of like ADHD, which might as well be a confession on my part). This will act as Lowering the Sanity Waterline even without the tech in hand, similar to how people going through social media withdrawal find it hard to focus and reason afterwards. FWIW, a lot of people find current LLMs emotionally inauthentic, so I think that milder use is the part which will stay mainstream rather than the extremes. I remember people crying wolf about similar extremes before, e.g. Superstimuli and the Collapse of Western Civilization; I'm not expecting that this time, at least not with the current tech. We would need more emotionally relatable chatbots to go mainstream before any AI rights revolution. (Some of my friends want to work on exactly that, since they're annoyed at ChatGPT's emotional clumsiness; I disagree on efficient-markets grounds given their current skillset, but that's another story.)
None of that about AI relationships sounds particularly bad. Certainly that’s not the sort of problem I’m mainly worried about here.
Some of it seems bad to roughly the same degree you thought phones were bad, tho?