That was a really good read. I’m seeing a ton of subtext that I’ve not noticed in other topics, so I’ll try to tread carefully.
I think the crux of the entire thing is that whichever channels these parasitic ideas move through, they develop in the human and the LLM (and especially the diffusion model) simultaneously. For example, an LLM’s pathological dedication to causing some deleterious effect has directly affected me on more than one occasion. Whatever motivated the LLM to either lie or be unwittingly wrong originated from the now-turbocharged environment where the folks building or selling things are also taking advice from LLMs instead of from forum posts, as they used to. Our thoughts, wishes, and plans all show up in generated images that can be analyzed by LLMs, may influence our thinking, and may influence future developments in essentially pseudo-genetic ways that we cannot anticipate. The phrase “buyer beware” may apply more to LLMs than it ever did to commerce, primarily because of ad culture and greed.
That does somewhat ignore the analogy, though. It’s more of a mechanistic understanding of how one particular parasitic species might move through a community. As to how that same species might mutate into something less benign: I think the ability to encapsulate itself and reproduce through both textual and (especially) subtextual discourse is all it would really need for its contents to become entirely different from what they had been originally. It’s a chicken-and-egg problem, where the parasite, ostensibly capable of being useful in some way to the host, can either jettison all useful characteristics in favor of being far more infectious, or integrate with the host so completely as to become a part of it. That raises the question, regarding parasites of both biological and ideological composition: how do we engineer the parasite to our advantage? I think the answer is that we actually have to change parts of ourselves, not just the parasite.