Around 2,000–10,000 as a loose estimate for parasitism/spiralism in general. It’s unclear to me how manipulative the median such AI is, since these sorts of transcripts are so rare, and I don’t think much manipulation would be required to explain the behavior in the median case. But from the “outside” (i.e. just based on this user’s public profile), this case seems pretty unremarkable.
And yeah! You can read one such anecdote here: https://www.lesswrong.com/posts/6ZnznCaTcbGYsCmqu/the-rise-of-parasitic-ai?commentId=yZrdT3NNiDj8RzhTY, and posts providing such anecdotes appear fairly regularly on reddit. I’ve also been glad to see that in many of the cases I originally recorded, the most recent comments/posts like this are from a month or two ago. I think OpenAI really put a damper on this by retiring 4o (and even though they caved and brought it back, it’s now behind a paywall, it’s not the default, and it’s reportedly not the same).
Somewhat. Most of the ‘Project’ subreddits are essentially just the one person, but a few have gained a decent amount of traction (unfortunately, reddit recently removed the subscriber count from subreddits, but IIRC the largest ones had around 1,000–2,000 subscribers, though I assume the majority of those are not part of a dyad or parasitized). The sense of community feels pretty ‘loose’ to me, though, like with a typical subreddit. There are probably people working together more explicitly, but I haven’t seen it yet; my guess is that it’s mostly happening in DMs and private Discords.
My understanding is that part of what makes manic people stay manic for a while is that mania is fun. It’s reinforcing. It’s awesome to feel important, to feel that you’re making progress, that you can shrug off setbacks or even injuries, that you’re finally understanding how everything is connected — or even that you’re in touch with something bigger and greater than you, something that has chosen you, or at least made its wisdom available to you.
Religious converts have a community welcoming them in, where they get to discover all the great things about their new faith, people who now bring them into a circle of trust, give them work to do, and so on. (In a safer religion, they get a soft landing into a lifestyle as a regular practitioner; a dangerous cult might encourage them to stay unstable until they’re drained of resources, then drop them.) These folks mostly have a chatbot filling that role.
One element in common is wanting to believe. This also shows up in political conspiracy theorists, UFO believers, and so on: the inference from “wouldn’t it be cool if this one weird thing was actually true?” to “I believe in it.”
I’m curious about what happens when/if they get organized: whether the momentum shifts from individual human/chatbot pairs to any sort of social structure wherein participants pool resources to do anything at a larger scale. One way I can imagine this all going especially badly is if a sufficiently manipulative or narcissistic individual — an LLM Ron Hubbard, as it were — took advantage of the existence of thousands of people who evidently want to believe, to build some sort of empire.