Slop, as I’ve generally seen the term used, refers specifically to AI-generated content, and you acknowledge as much in your article. But despite recognizing that this is what “slop” usually means, you seem to argue broadly in favor of the idea that more people being able to produce more works will lead to good things. If those works were like the ones being made prior to, say, five years ago, before the influx of slop began, I might agree with you. If there were simply easier ways for smaller creators to get seen, or for people to get started with fewer resources, I would agree. But I don’t think that’s what’s happening. What’s happening is that larger amounts of attention can now be captured with far less effort. And that’s the main worry when people are being negative about slop: if it takes only a few minutes of effort to generate 100 videos that could potentially entertain 80 out of 100 people, then it’s possible to make money and capture attention with far higher efficiency and far lower effort than was possible pre-slop.
And that means low-effort content will make up a much higher share of what everyone sees on the Internet, because it captures attention in those feeds as well as or better than things that took real effort. A rising tide of slop raises all boats, possibly, yes; but most of those boats are slop.
Anecdotal evidence: I recently found a channel that generated about 10 videos per day, all themed as “Simpsons episode recaps” using “screenshots” of “episodes” that were really just AI-generated images, and nearly all of them revolved around Elon Musk, or some other popular figure, visiting Springfield.
Some of these videos were truly horrendous: hilariously bad or outright unsettling stuff, absolutely incomprehensible imagery, nonsensical scripts that were obviously written by ChatGPT on a cheap plan, and so on.
But even though almost all of the thousand-plus videos they had uploaded were doing extremely poorly, every once in a while one would hit thousands or hundreds of thousands of views, because that particular batch of slop happened to be especially engaging, or the script was just coherent enough to seem plausible and the images just convincing enough to seem real. That makes it profitable for this person to spam YouTube with ten or more slop videos a day on the off chance that one becomes a big hit. I don’t think most people would consider this sort of channel beneficial to anyone, except those who want cheap entertainment and don’t care about the validity of what they’re seeing or how it was produced.
Also, on these platforms everyone has to create a huge glut of content in the hope of landing one or two big videos, because you could simply get unlucky and get only a couple of eyeballs on your stuff even if you’ve made something truly good or worthwhile, and the algorithm particularly favors regular uploads. This means the increased volume from all the slop is making it harder for people to get seen, not easier, as you seem to suggest.
Yeah, I really think people should be clearer about when they’re linking to their own paywalled Substack. Shouldn’t this be a linkpost at the very least?