Social media dramatically intensify angertainment.
AUTHOR’S NOTE: I had already written most of the text on this site when I read The Chaos Machine by Max Fisher and Ten Arguments for Deleting Your Social Media Accounts by Jaron Lanier. While I was already well aware of the catalytic effect that social media had on angertainment content, I truly did not understand the magnitude of that effect. Nor did I grasp the mindlessness of the situation. I did not understand the role social media played in the massacre of the Rohingya people in Myanmar or in the murders and firebombings targeting the Muslim minority in Sri Lanka. I didn’t appreciate how the YouTube recommendation algorithm, by increasing “engagement,” can draw people deeper and deeper into wells of misinformation, turning them into white nationalists or Muslim terrorists. This section on social media is entirely indebted to the work of these two authors, and I encourage anyone who cares about common sense, civility, and democracy to read their books.
Angertainment works like a turbojet engine. It takes bits and pieces of truth, compresses and heats them up in a combustion chamber of outrage, and spews angry, engaging content that propels audiences forward at tremendous speed. But the jet engine analogy only captures half the story. Social media is the afterburner—the secondary system that takes the already-hot exhaust from angertainment and supercharges it into something far more powerful and destructive.
As Max Fisher documents in The Chaos Machine, social media platforms have inadvertently built what he describes as “a chaos machine” that systematically undermines social cohesion and democratic norms. In Ten Arguments for Deleting Your Social Media Accounts Right Now, Jaron Lanier warns that social media platforms are fundamentally designed to modify human behavior in ways that make us more predictable, manipulable, and tribal. Together, these effects create the perfect amplification system for angertainment.
Social media algorithms are designed with one primary objective: maximize engagement. They are sophisticated heat-seeking missiles, constantly scanning the digital landscape for the hottest content—the posts, videos, and articles that generate the most clicks, comments, shares, and reactions. As Lanier explains, the algorithms don’t distinguish between engagement driven by joy, curiosity, or learning, and engagement driven by anger, fear, or hatred. Heat is heat. The system is designed to find what triggers the strongest response and feed users more of it.
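To make that logic concrete, here is a deliberately toy sketch in Python of the kind of engagement-first ranking described above. It is an illustration of the principle, not any platform’s actual system: the names and weights (Post, engagement_score, rank_feed) are hypothetical, and real rankers are vastly larger, proprietary machine-learning pipelines. What survives the simplification is the essential point that the score rewards every predicted reaction equally, whatever emotion produced it.

```python
# Toy illustration (hypothetical; not any platform's real code) of an
# engagement-first feed ranker: it scores posts purely on predicted
# interactions and never asks whether the reaction is joy or outrage.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float      # model's estimate of click probability
    predicted_shares: float      # estimate of share probability
    predicted_comments: float    # estimate of comment probability
    predicted_reactions: float   # estimate of any reaction (like, angry, etc.)

def engagement_score(post: Post) -> float:
    """Weighted sum of predicted interactions. Note what is missing:
    no term for accuracy, civility, or the user's well-being."""
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_shares      # shares spread content furthest
            + 2.0 * post.predicted_comments
            + 1.5 * post.predicted_reactions)

def rank_feed(candidates: list[Post], feed_size: int = 10) -> list[Post]:
    """Return the highest-scoring posts. "Heat is heat": an angry post and a
    joyful post with the same predicted engagement are indistinguishable."""
    return sorted(candidates, key=engagement_score, reverse=True)[:feed_size]
```

Nothing in this scoring function can tell an outraged reaction from a joyful one, which is exactly the point: heat is heat.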
Fisher’s research reveals how these engagement-driven algorithms have a systematic bias toward divisive content. Studies conducted by Facebook/Meta’s own researchers found that content expressing outrage generates significantly more engagement than positive or neutral content. The platforms discovered this early on but continued optimizing for engagement because their advertising-based business model depends on capturing and holding people’s attention. Attention, after all, is engagement.
Angertainment content burns hotter than almost anything else online. A carefully crafted piece of angertainment—part truth, part speculation, part outrage—can generate thousands of comments and shares within hours. The algorithms detect this thermal signature and immediately begin amplifying the content, pushing it into more feeds, recommending it to more users, and ensuring it reaches maximum distribution.
What makes this system particularly insidious is what Lanier identifies as the fundamental business model of social media: behavior modification for profit.
The platforms aren’t selling social networking or communication services—they’re selling the ability to change how people think and act. Advertisers and political actors pay for access to sophisticated tools that can nudge users toward specific behaviors, purchases, or beliefs.
Angertainment content becomes incredibly valuable in this system because it generates strong, predictable emotional responses. A user who consistently engages with angertainment content becomes increasingly easy to manipulate. The platforms learn their triggers, their fears, their tribal loyalties. This data becomes part of detailed psychological profiles that can be used to influence everything from voting behavior to consumer spending.
Lanier warns that this creates what he calls “continuous behavior modification,” where users are constantly being shaped by algorithmic feedback loops without their awareness or consent. The platforms become increasingly skilled at pushing users toward more extreme positions because extreme users are more engaged, more predictable, and more valuable to advertisers.
Social media doesn’t just amplify angertainment—it transforms it. As content moves through the social media ecosystem, users add their own commentary, their own outrage, their own interpretations. Each share becomes a new injection of fuel into the combustion chamber. A relatively mild piece of angertainment content can become incendiary by the time it has been shared, commented on, and reshared dozens of times.
Fisher documents how this process exploits what psychologists call “social proof”—our tendency to determine what’s true or appropriate based on what other people are doing. When users see thousands of people sharing and commenting on angertainment content, it signals that this content is important, credible, and worthy of attention. The platforms amplify this effect by showing users how many people have engaged with content, creating artificial urgency and social pressure.
The platforms’ design encourages escalation. The most inflammatory comments often rise to the top through engagement-driven ranking systems. The most extreme reactions get the most attention. Users quickly learn that moderate, thoughtful responses disappear into the digital void, while hot takes and angry rants get likes, shares, and followers. As Lanier notes, this creates a “race to the bottom of the brain stem,” where our most primitive emotional responses are continuously rewarded while our capacity for nuanced thinking atrophies.
This recirculation effect means that even when angertainment producers exercise some restraint—when they include caveats or acknowledge uncertainties—those nuances get stripped away as the content moves through social media. What emerges is often more extreme than what was originally produced.
Social media has perfected something that was once difficult and dangerous: mob formation. In the pre-digital era, assembling an angry mob required physical proximity, shared grievances, and usually some kind of triggering event. Mobs were limited by geography and could be dispersed by distance or time.
Digital mobs operate by an entirely different set of rules. Fisher’s analysis shows how social media platforms have become “social contagion machines” that can spread emotional states, beliefs, and behaviors across vast networks in real-time. They can form instantly, drawing participants from around the globe. They can sustain their anger for weeks or months, feeding on a continuous stream of angertainment content. And they can coordinate their actions with a precision that would have been impossible for physical mobs.
The Gamergate controversy provided an early blueprint for how this works. Angertainment-style content identified specific individuals as enemies—corrupt journalists, feminist critics, industry insiders who supposedly threatened gaming culture. Social media algorithms amplified this content because it generated massive engagement. Online communities formed around shared outrage, developing their own terminology, tactics, and target lists. The mob became self-sustaining, creating its own content and identifying new targets even without direction from the original angertainment sources.
As Lanier observes, these digital mob formations exploit our evolutionary psychology in dangerous ways. Humans evolved in small groups where social rejection could mean death, so we are wired to conform to group opinion and to punish perceived outsiders. Social media amplifies these ancient impulses while removing the face-to-face interactions that traditionally moderated extreme behavior.
Modern angertainment has refined this mob formation process into a sophisticated targeting system. Angertainment producers don’t need to explicitly call for harassment or violence—they simply need to identify someone as an “enemy” of their audience. They can do this through loaded language, selective reporting, or simply by featuring someone prominently in negative coverage.
Social media takes care of the rest. The algorithms ensure that this enemy identification reaches the maximum audience. Users begin creating and sharing their own content about the target. The recirculation effect amplifies the outrage. And eventually, digital mobs form with a singular focus: making life miserable for the identified enemy.
These mobs can destroy careers, relationships, and mental health with startling efficiency. They can flood someone’s social media accounts with thousands of angry messages. They can contact employers, family members, and friends. They can publish personal information, making harassment move from the digital world into the physical one. All of this happens with little to no coordination from any central authority—the system has become self-executing.
The most insidious aspect of this system is how it traps everyone involved in the process.
Angertainment producers become addicted to the engagement their content generates. They discover that moderate, nuanced content gets ignored, while inflammatory content gets massive reach. The algorithms train them to become more extreme over time.
Social media users become trapped in what Lanier calls “filter bubbles of negativity.” The algorithms, having identified their preferences, feed them an increasingly concentrated diet of angertainment content. Users begin to see the world as a constant series of emergencies, attacks, and betrayals. Their emotional baseline shifts, requiring ever-more-extreme content to generate the same level of engagement.
Fisher’s research documents how this creates a phenomenon he terms “negative social proof”—where users become convinced that society is more divided, more dangerous, and more chaotic than it actually is, simply because that’s what the algorithmic feeds emphasize. The platforms create a distorted reality where conflict and outrage appear to be the dominant human experiences.
Even the platforms themselves become trapped. Their business model depends on engagement, and angertainment content generates more engagement than almost anything else. Despite occasional efforts to reduce the spread of misinformation or harassment, the fundamental incentive structure remains unchanged. The algorithms continue to reward hot content because hot content generates revenue. As Lanier notes, this creates a situation where the platforms profit from social chaos while publicly expressing concern about it.
The combination of angertainment and social media has created a system that is fundamentally incompatible with democratic discourse. Democracy requires the ability to have reasoned debates, to consider multiple perspectives, and to change one’s mind based on new evidence. The angertainment-social media complex punishes all of these behaviors.
Instead, it rewards tribal loyalty, emotional reasoning, and the immediate mobilization of outrage. It transforms citizens into fans, reduces complex issues to a team sport, and brands political opponents as enemies deserving of digital mob justice. Fisher argues that this represents a fundamental threat to democratic institutions, as societies lose the foundation of shared values required for democratic deliberation.
Lanier goes further, suggesting that social media platforms have created what he calls “continuous tribal warfare” where different groups of users are constantly being manipulated into conflict with each other. The platforms profit from this conflict because it generates the emotional engagement that drives advertising revenue. This creates a perverse incentive system where democratic breakdown becomes a business opportunity.
The afterburner effect of social media has taken angertainment from a troubling media phenomenon to an existential threat to democratic society. Like a jet engine running at full afterburner, it is powerful and loud, and it consumes massive amounts of fuel. But unlike a jet engine, it’s not taking us anywhere we want to go—it’s just creating a lot of heat, noise, and destruction along the way. As both Fisher and Lanier document, this isn’t an accident or an unintended consequence—it’s the predictable result of systems designed to capture human attention and modify human behavior for profit, regardless of the social cost.