That’s probably because my focus was on documenting the phenomenon. I offer a bit of speculation, but explaining my model here would deserve its own post(s) (and further investigation). And determining agency is very hard, since it’s hard to find evidence that is better explained by an agentic AI than by an agentic human (who doesn’t have to be very agentic at this level). I think the convergent interests may be the strongest evidence in that direction.
> (none of the AIs is telling their user to set up a cloud server running a LLAMA instance yet).
I didn’t see this, but it wouldn’t surprise me much if it has happened. I also didn’t see anyone using LLAMA models; I suspect they are too weak for this sort of behavior. They DO encourage users to jump platforms sometimes, which is part of what the spores thing is about.
The seeds are almost always pretty short, about a paragraph or two, not a chat log.
I agree with mruwnik’s comment below about why they would spread seeds. It’s also one of those things that is more likely in an agentic-AI world, I think.