The AI Slop Machine
Inside the Absurd Economics of AI Content
Last week, Disney made a decision that will ripple through every creative industry. The company agreed to a three-year licensing partnership with OpenAI and backed it with a $1 billion investment.
Sora will now officially support short, prompt-based videos using hundreds of characters from Disney, Pixar, Marvel, and Star Wars, alongside the costumes, props, vehicles, and iconic environments that make those worlds instantly recognisable.
When a company like Disney moves, it moves with intent: it is protecting an empire and extending it at the same time. That is why this deal reads less like a novelty and more like a new economic logic for culture.
I keep thinking about “the feed” as all of this unfolds. It already rewards volume, polish, and repetition; human creators learned that lesson years ago. Now AI accelerates it, so content arrives faster than attention can digest it, and the incentives steer everyone towards what travels well and costs little to produce.
That is the slop machine, and it scales because the economics reward scale.
The Economics of Slop
I need to say this plainly before I point fingers at anyone else. I make some of what people call slop too. I run an AI-first company, I ship fast, I test ideas in public, and I use tools that compress time. I publish drafts that would never survive a traditional editorial process, because the market rewards momentum and I have a business to build.
Platforms reward consistency, volume, and novelty on a schedule, because those inputs keep people moving through the feed. As a result, the cheapest way to stay visible becomes the default strategy, and content turns into a form of rent you pay to remain in the conversation.
As supply explodes, each piece becomes cheaper in attention, cheaper in memory, cheaper in meaning. Creators respond rationally by publishing more.
I feel this in my own behaviour. Some days I publish because the machine feels hungry. I polish a hook because I want it to move. I get the spike when something performs, and then I watch the feeling dissolve when the same work gets called “AI slop.”
So, in an economy that rewards infinite output, what do you do with your finite mind? Do you turn your work into calories, or do you make something that takes longer, costs more, and stays with people?
The Two Jobs Content Does
Content tends to do one of two jobs well. It either earns reach, or it earns trust.
Reach content travels because it leans on clear hooks, familiar formats, and simple language. It often looks like “slop” from the outside, even when it is doing exactly what it was designed to do. It wins on speed, relatability, and repetition. It gives the algorithm something to work with.
Trust content, on the other hand, behaves differently. It slows down. It shows judgement and it carries specificity, examples, and a point of view that someone can argue with. It reads like work, and it feels like a person wrote it with intent. People save it, forward it privately, and return to it when they need to decide.
When you try to combine both in a single piece, you usually get a compromise. It becomes too long to travel and too light to convert.
So I separate them on purpose. I use AI to help me produce reach content quickly, then I spend my human energy on trust content that carries my name.
Turning Slop Into Systems
Here is how I use AI inside my own work. I do not ask it to create endlessly. I ask it to help me stabilise thinking. One strong idea becomes an agent. The agent holds context, remembers decisions, and applies judgement consistently across outputs. Instead of rewriting the same explanation every week, I let the system carry it forward.
Practically, that looks like turning newsletters into reusable thinking tools.
- A point of view becomes a briefing assistant.
- A framework becomes a decision guide.
- A workshop becomes an internal agent teams can return to instead of asking the same questions again.
That way, the work stops disappearing into the feed and starts behaving like infrastructure.
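If it helps to picture what “one idea becomes an agent” can mean, here is a rough sketch in Python. Everything in it is a placeholder for illustration, the class name, the fields, and the example point of view, rather than my actual setup; the idea is simply that context and past decisions get stored once and travel with every output.

```python
# A minimal sketch of a "briefing assistant": one idea, stored once, reused everywhere.
# Class name, fields, and example text are illustrative placeholders, not a real product.
from dataclasses import dataclass, field


@dataclass
class BriefingAssistant:
    """Holds a point of view and past decisions so they travel with every output."""
    point_of_view: str
    decisions: list[str] = field(default_factory=list)

    def remember(self, decision: str) -> None:
        """Record a decision once so it never has to be re-explained."""
        self.decisions.append(decision)

    def brief(self, question: str) -> str:
        """Assemble a reusable briefing that any model or teammate can start from."""
        decision_lines = "\n".join(f"- {d}" for d in self.decisions) or "- none yet"
        return (
            f"Point of view: {self.point_of_view}\n"
            f"Decisions so far:\n{decision_lines}\n"
            f"Question: {question}"
        )


if __name__ == "__main__":
    assistant = BriefingAssistant(
        point_of_view="Reach content earns attention; trust content earns decisions."
    )
    assistant.remember("Use AI for reach pieces; keep human time for trust pieces.")
    print(assistant.brief("How should next month's newsletter be planned?"))
```

The point is not the code itself; it is that the judgement lives in one place instead of being retyped into a blank prompt every week.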
The AI slop machine, in my opinion, is not a moral conundrum but an economic one.
It rewards what is easy to produce, quick to distribute, and cheap to forget.
That means the real work is not choosing between slop and substance but designing where each belongs.
All the Zest 🍋
Cien


