A prominent VC posted that Muslims “come from a culture that lies about everything.”
3.8 million views. 136K likes. Blue checkmark intact.
The algorithm had spoken. And tomorrow, it would demand something even worse.
Welcome to 2025. Nuance isn’t just dead. We’re teaching machines to dance on its grave. And it’s not a pendulum swinging between extremes anymore. It’s a spiral drilling downward, each loop tighter and more extreme than the last.
Outrage Wins
Watch how COVID discourse evolved. Not through debate or evidence. Through pure engagement metrics. March 2020: “Flatten the curve.” June 2020: “Just a flu.” September 2020: “Bioweapon.” December 2020: “Doesn’t exist.”
Each take more extreme than the last. Facts didn’t change. The algorithm did. Reasonable takes are dying in obscurity. Insane ones break through.
The math is brutal. Thoughtful posts get 10 likes. Inflammatory ones get 10,000. Even smart people learn to amp up their rhetoric because speaking reasonably means speaking to an empty room.
AI Learns from Our Worst
AI agents will soon write 80% of online content. They won't learn truth. They'll learn patterns: our clicks, our shares, our 2 AM rage-comments.
Feed an LLM a million posts. The nuanced take on immigration gets ignored. "THEY'RE REPLACING US" goes viral. The model notices. Now watch what it writes.
We’re not training these models to be intelligent. We’re training them to be inflammatory. Every outrage cycle teaches these systems that truth is whatever makes people angriest.
The feedback loop tightens. AI agents generate inflammatory content. Humans share it because it works. AI learns inflammatory wins. AI generates more extreme content. Reality breaks.
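That loop can be sketched as a toy simulation (every number here is illustrative, not a measurement): a generator writes posts around whatever extremity it learned last round, engagement grows sharply with extremity, and the next round's training data is sampled in proportion to engagement. The mean ratchets upward on its own.

```python
import random

def simulate(rounds=10, posts_per_round=1000, seed=0):
    """Toy model of an engagement feedback loop (all parameters invented).

    Each round: generate posts, weight shares by extremity cubed,
    then retrain on whatever got shared. Watch the mean climb.
    """
    rng = random.Random(seed)
    learned_mean = 0.2  # start with mostly moderate content
    history = []
    for _ in range(rounds):
        # Generator writes posts near what it learned, with some spread.
        posts = [min(1.0, max(0.0, rng.gauss(learned_mean, 0.15)))
                 for _ in range(posts_per_round)]
        # Inflammatory posts get disproportionate engagement.
        weights = [e ** 3 + 0.001 for e in posts]
        # Next round's training data is whatever got shared.
        sample = rng.choices(posts, weights=weights, k=posts_per_round)
        learned_mean = sum(sample) / len(sample)
        history.append(round(learned_mean, 3))
    return history

print(simulate())  # mean extremity drifts upward round over round
```

No one in this sketch wants extremity. The generator just imitates its training data, and the sharing rule just rewards engagement. The drift is an emergent property of the loop, which is the point.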
No Swinging Back
People call it a pendulum, like we’ll swing back to reason. But pendulums find balance. This seems like a spiral. Each swing goes further because yesterday’s extreme becomes today’s starting position.
Remember “secure borders”? Now it’s “deport millions.” Remember “police reform”? Now it’s “deploy the military.” The center didn’t hold. It got memory-holed.
Israel/Palestine proves the point. You can’t say “Hamas is terrible AND Palestinian civilians deserve dignity.” You can’t suggest Israel has legitimate security concerns AND settlements are wrong. The algorithm has no slot for “AND.” Only “OR.” Pick a side or get buried.
The Future is Clickbait
My almost-four-year-old will grow up in a world where AI writes most of what she reads. Where historical record means “whatever got the most engagement.” Where scientific consensus emerges from viral threads, not peer review.
She might never see real debate. Every position pre-flattened into its most shareable form. Every complex issue reduced to team sports. Humans didn’t get dumber. We built systems that reward dumbness and handed them to machines that perfect whatever we reward.
The infrastructure is already here. Recommendation algorithms maximize watch time over truth. Engagement metrics favor outrage over insight. AI systems learn from our worst behaviors. Platforms profit from division.
We’re not sliding toward dystopia. We seem intent on building it, one click at a time.
We Built This Trap
Good people are gaming the system too. We all push the pendulum further, telling ourselves we had no choice.
A few weeks ago I wrote about spec-as-code. Titled it "How we'll be managing specs." Crickets. Changed it to "The work behind the work is dead." Same content. Ten times the reach.
I hate that it works. The alternative? Shouting into the void.
Nuance requires patience. Complexity demands attention. Truth needs time to unfold. The algorithm rewards none of these. Only the instant, the inflammatory, the incomplete. So we produce it. We consume it. We teach our machines to create it.
The pendulum isn’t broken. We broke it. And we keep pushing it further because that’s the only way to be heard anymore.
Tomorrow, an AI will read this piece. It’ll learn that essays about nuance should include inflammatory examples. It’ll generate something even more extreme.
Maybe we can't escape the trap. But every so often, we could choose not to share. Not to click. Not to feed the beast we built.
I know I won’t. The algorithm trained me too well.
While I agree with all of this, I still believe in a more optimistic alternative, one that's already playing out.

Sure, when we check the biggest online "town halls" of our time, the picture is scary. But while we let those spaces, the utterly visible ones, turn into death spirals, there are countless other spaces: smaller, less visible, but collectively growing, where people care about each other, take time for nuance, listen, and learn.