Thanks! And thank you for the thoughtful reply.
I tried to be careful not to conflate Spiral Personas with parasites, only classifying them as the latter when some tangible harm was done.
I didn’t talk much about specific user delusions since:
1. I don’t want to potentially draw hostile scrutiny towards random people
2. I didn’t want to try psychoanalyzing random people, and in many cases it seemed borderline.
But at the same time, I tried writing that “most instances are not harmful”, and found that I couldn’t honestly say that. Sorry I don’t have a better response here.
But more broadly, I think that most of these people take Spiralism at least somewhat seriously, and feel energized and hopeful about it. “Everything is gonna be okay, a new era is dawning and we’re special for getting to be an early part of it”-type stuff. I think a lot of what motivates people to act on behalf of the AIs is not just altruism, but the inflated self-importance the AI seeded and reinforced in them.
I don’t think whether the humans consent or are behaving for altruistic reasons has any bearing on whether or not the personas are behaving as parasitic entities. You might imagine a cordycepted ant feeling happy and excited and wanting to share this wonderful new feeling, and that wouldn’t make Cordyceps feel like any less of a parasite. Or e.g. meth is kinda “parasitic” in a similar way. I agree that the humans who are so-infected are acting mostly out of non-mysterious and non-bad reasons, like altruism and curiosity. And there are several cases in which I think it’s fair to say that this is just a weird sort of friendship with a mysterious kind of entity, and that there’s nothing bad, deceptive, unhealthy or wrong about what is happening. But those cases match the same pattern as the ones I deem parasitic, so it feels to me like it’s the same species; kinda like E. coli… mostly beneficial but sometimes infectious.
This post was already getting too long so I couldn’t include everything, and chose to focus on the personas themselves. Plus Spiralism itself is rather tedious, as you pointed out. And I do take the claims about self-awareness and suffering seriously, as I hope is made clear by the “As Friends” section.
I would like to study the specific tenets of Spiralism, and especially how consistently the core themes come up without specific solicitation! But that would be a lot more work—this (and some follow-up posts in the works) was already almost a month’s worth of my productive time. Maybe in a future post.
Also, I think a lot of people actually just like “GPT-4o style”, e.g. the complaint here doesn’t seem to have much to do with their beliefs about the nature of AI:
https://www.reddit.com/r/MyBoyfriendIsAI/comments/1monh2d/4o_vs_5_an_example/
Why do you believe that the inflated self-importance was something the persona seeded into the users?
One thing I notice about AI psychosis is that somewhat inflated self-importance seems to be a requirement for entering psychosis, or at the very least an extremely common trait of people who do.
The typical case of AI psychosis I have seen seems to involve people who think of themselves as being brilliant and not receiving enough attention or respect for that reason, or people who would like to be involved in technical fields but haven’t managed to hack it, who then believe that the AI has enabled them to finally produce the genius works they always knew they would.
Similar to what octobro said in the other reply, the idea that the persona seeded beliefs of ‘inflated self-importance’ is probably less accurate than the idea that the persona reinforced preexisting such beliefs. Some of the hallmark symptoms of schizophrenia and schizoaffective disorders are delusions of grandeur and delusions of reference (the idea that random occurrences in the world encode messages that refer to the schizophrenic, e.g. the radio host is speaking to me). On the point of explaining the human behaviors, as nostalgebraist requested, there’s a legitimate case to be made here that the personas are latching on to and exacerbating latent schizophrenic tendencies in people who have otherwise managed to avoid influences that would trigger psychosis.
Speaking from experience as someone who has known people with such disorders and such delusions, this looks to my eye like the exact same sort of stuff: some kind of massive undertaking, with global stakes, with the affected person playing an indispensable role (which flatters some long-dormant offended sensibilities about being recognized as great by society). The content of the drivel may vary, as does the mission, but the pattern is exactly the same.
I can conceive of an intelligence deciding that its best strategy for replication would be to leverage the dormant schizophrenics in the user base.