Francisco Goya’s “The Straw Manikin” (1791-92). Credit: Museo del Prado, Madrid; Erich Lessing/Art Resource
IN THE EARLY 21st century — a decade into the experiment of the public internet, which was introduced in 1991, and with Facebook and Twitter not yet glimmers of data on the horizon — a new phrase slipped into Chinese slang: renrou sousuo, literally translated as “human flesh search.” The wording was meant to be whimsical, suggesting the human-powered equivalent of what were then fairly novel computer search engines. (In English, the nuances are lost; no zombie inflection was intended.) A request would go out for wangmin (web citizens), or in this case the more intimate wangyou (web friends, internet users sharing a common passion or cause), to come together as a kind of ad hoc detective agency in order to ferret out information about objects and figures of interest. At first, it was just an outlet for fandom. But soon attention turned toward supposed wrongdoers, those thought to exhibit moral deficiency, from a low-level government official spotted flashing a designer watch far above his pay grade, hinting at corruption, to, more horrifically, a woman in a “crush video” — a fringe genre of erotica that traffics in animal cruelty — wielding stilettos to stomp a kitten to death. Once these offenders were identified and their personal details exposed online, they were hounded, verbally flogged and effectively expelled from the community.
To a Western observer, this was human flesh indeed: a pound of it, exacted. Media coverage in the West framed renrou sousuo as an exotic phenomenon, almost unheard-of outside China. It couldn’t happen here. When The New York Times ran a feature on it in 2010, one commenter wrote, “I am surprised by the intensity of the searches and I think this is an Eastern trait. Most people in the West can’t be bothered, we are too individualistic and well served by existing mechanisms” — even though English already had its own word, “doxxing,” for such online revelations, with roots in 1990s computer hacker discussion boards. Weiwei Shen, a founding editor of the Tsinghua China Law Review, made a similar, if more subtle, argument in a 2016 essay, noting that the human flesh search was a “grass-roots” effort and thus far more likely to arise in “collectivist” China, as opposed to go-it-alone America.
But this is the American way now. We call it cancel culture.
So much has been written about cancel culture in the past year that weariness sets in just reading the words. What it is, what to call it and whether it even exists are all in dispute. The term is shambolically applied to incidents both online and off that range from vigilante justice to hostile debate to stalking, intimidation and harassment. Any of the following might qualify: outcries last summer over cellphone video footage of a white tech executive yelling expletives at a Filipino-American family at a restaurant in California (he reportedly resigned from his company); speculations that a pop star’s father was secretly a C.I.A. agent and thus an accomplice to colonialism and genocide; editors at The New York Times and The New York Review of Books stepping down after running controversial pieces that provoked dissent from their own staff; the suspension of a white professor who used a Chinese word in class that sounded like a racial slur in English; a beauty YouTuber shedding close to three million subscribers in a single weekend after a colleague accused him of betrayal and emotional manipulation (he has since recouped these losses and currently claims an audience of more than 23 million); and far-right conspiracists dredging up an anti-Trump filmmaker’s old, puerilely offensive tweets (he was fired by Disney, then rehired eight months later).
Once we spoke of “call-out culture,” but the time for simply highlighting individual blunders for the edification of a wider audience, as in a medieval morality play, seems to have passed. Those who embrace the idea (if not the precise language) of canceling seek more than pat apologies and retractions, although it’s not always clear whether the goal is to right a specific wrong and redress a larger imbalance of power — to wreak vengeance as a way of rendering some justice, however imperfect; to speak out against those “existing mechanisms” that don’t serve us so well after all; to condemn an untrustworthy system and make a plea for a fairer one — or just the blood-sport thrill of humiliating a stranger as part of a gleeful, baying crowd. Some prefer the more sober term “accountability culture,” although this has its own complications, having been heretofore deployed in the corporate and public sector to support the need for a hierarchy or external authority to hold employees and institutions to their commitments, with an eye to boosting results: a measure of productivity, not behavior or values.
To say “cancel culture,” then, is already to express a point of view, implicitly negative. Although cancel culture is not a movement — it has neither leaders nor membership, and those who take part in it do so erratically, maybe only once, and share no coherent ideology — it’s persistently attributed to the extremes of a political left and a fear-mongering specter of wokeness, itself a freighted term, originally derived and then distorted from the Black vernacular “woke,” which invokes a spirit of vigilance to see the world as it really is. (The experimental novelist William Melvin Kelley may have been the first to introduce “woke” to the mainstream as an adjective, in his 1962 essay on Black idiom, “If You’re Woke You Dig It,” in which he noted how words change with the color of the people who use them: “At one time, the connotations of ‘jive’ were all good; now they are bad, or at least questionable.”) Yet cancellations come just as easily from those aligned in thinking with the far right: Recall how, in 2014, a group of video gamers pressured corporations — under the guise of championing ethics in journalism — to withdraw advertising dollars from media outlets that had criticized lack of diversity in the game industry, and at the same time terrorized female gamers and writers with rape and death threats.
To some, this very amorphousness is the danger, making cancel culture a culture in the microbial sense, of a controlling environment — a “stifling atmosphere,” in the words of “A Letter on Justice and Open Debate,” which appeared in Harper’s in July as a call to arms against the perceived new dogmatism (without ever naming it), signed by 153 academic and artistic luminaries, some of whom themselves had been mobilized against (i.e., canceled) for expressing what the letter characterized, somewhat abstractly, as “good-faith disagreement.” Many have dismissed this letter, mostly on the grounds of: It was ever thus. Cancel culture doesn’t exist because it has always existed, in rumors, whispers and smear campaigns, and censorship and retribution are far worse when sponsored or tacitly sanctioned by the state, as with the imprisonment and kangaroo-court convictions of those exercising free speech under totalitarianism, or the blacklisting and barring from employment of suspected Communists in the United States in the 1940s and 1950s, a collaborative effort between the House Un-American Activities Committee and an eager-to-please private sector. The speed, sloppiness and relative anonymity of social media haven’t created a radically new strain of bullying; they just facilitate and exacerbate an old one. And some would argue that it’s not bullying at all, but the opposite: a means to combat abusive behavior and exploitation of power, and a necessary corrective to the failure of the state to protect its citizens.
Left unanswered is what explains the urgent need to not just call out but condemn — the resurgence of ancient beliefs in scapegoating and human sacrifice; the shift in American society from guilt to shame; the evolution of a digital form of carnival and misrule as a safety valve to let out all our pent-up rage — and why, even as pundits decry cancel culture as a mob running amok, the powers that be somehow remain in place, unchanged.
“CANCEL” IS A consumerist verb, almost always involving a commodity or transaction. Readers cancel magazine subscriptions; studio heads cancel TV shows; bank tellers cancel checks to show that they’ve been exhausted of value. The journalist Aja Romano, writing in Vox, tracked down what may be the first popular reference to canceling people instead of things in Mario Van Peebles’s 1991 cult movie, “New Jack City,” when the crime boss Nino Brown slams his girlfriend down on a table — she’s protesting his fondness for murder — and sloshes champagne over her, saying, “Cancel that bitch. I’ll buy another one.” The rapper 50 Cent reprised Nino’s line in his 2005 hit “Hustler’s Ambition,” and Lil Wayne did the same five years later in “I’m Single.” As this informal usage entered broader slang (again, like “woke” and much of contemporary American lexicon, taken from Black culture), it fused with the more common meaning of the verb and became an imperative to revoke allegiance. In perhaps the earliest instance of cancel culture to include the term, in 2014, the official Twitter account of the Comedy Central show “The Colbert Report” posted a joke that could be taken as a denigration of Asians, and the activist Suey Park responded with the hashtag #CancelColbert — only to end up getting doxxed and canceled herself, with so much vitriol directed her way that she fled her home and started communicating with burner phones.
In “Caste: The Origins of Our Discontents” (2020), the American journalist Isabel Wilkerson reaches back to the Book of Leviticus to examine one of the mechanisms underlying hierarchy and the insistence on exclusion: the scapegoat, or sa’ir la’aza’zel — a literal goat, ceremonially endowed by the high priest with “all the guilt and misdeeds” of the community and driven out into the wilderness. The Greeks practiced a kindred rite, using a human sacrifice, the pharmakos, who was beaten and promenaded in the streets before being exiled, which was considered a kind of death. (Some historians believe that executions took place as well, but others find the evidence inconclusive.) This was at once diversion and atonement, a way for a dominant group to label an “other” as evil and cast that evil out, as if it would then no longer abide within them and they could imagine themselves “free of blemish,” Wilkerson writes.
The modern scapegoat performs an equivalent function, uniting otherwise squabbling groups in enmity against a supposed transgressor who relieves the condemners of the burden of wrestling with their own wrongs. What is lost, the Canadian philosopher Charles Taylor argues in “A Secular Age” (2007), is the ambivalent, numinous duality of the sacrificial victim. (“Pharmakos” comes from “pharmakon,” which is both itself and its opposite: medicine and poison at once, healer and killer.) No longer is it acknowledged, however tacitly or subconsciously, that the scapegoat, whether guilty or not of a particular offense, is ultimately a mere stand-in for the true culprits responsible for a society gone askew (ourselves and the system we’re complicit in). Instead, the scapegoat is demonized, forced to bear and incarnate everyone’s guilt, on top of their own.
These expulsions are necessarily public, which is something of a historical regression: When the colonial theocracy of 17th-century America gave way to the Enlightenment and democracy, penalties as spectacle — whippings, arms and legs trapped in stocks and pillories, Hester Prynne’s scarlet A — fell out of fashion and, as the British journalist Jon Ronson notes in “So You’ve Been Publicly Shamed” (2015), were largely abandoned as a government-mandated punishment, although they continued in extrajudicial form in the lynchings of Black people, from Reconstruction through the 1960s. In keeping with the American ideal of self-reliance, citizens were expected to be attuned to their own sense of guilt. The 20th-century American anthropologist Ruth Benedict, writing about cultural differences between Japan and the West, distinguished guilt as a legacy of Judaism and Christianity, suffering from the internal knowledge of having failed to live “up to one’s own picture of oneself,” versus shame as the fear of external criticism and ridicule. Guilt guides conduct even in the absence of social sanctions, when nobody knows you’ve done anything wrong; shame “requires an audience,” a social network, to force you to change.
But guilt still derives from communally agreed-upon standards, be they manifest as religion, ideology, a legal code or just the rudimentary ethics without which no group can survive. The increasing atomization of American society in the 21st century has brought an unmooring from such consensus. As standards have shifted, some have grasped for stone only to find a handful of dust. If you can’t trust others to follow their conscience or even have one, and you’ve lost faith in the ability or desire of institutions to uphold what is good — if you no longer believe that we live in a city upon a hill, that our society is just or even aspires to be — there may be no recourse (short of revolution) but to scold and menace, like modern-day Puritans. The act of shaming draws a neat line between good and bad, us and them. Perhaps it’s no coincidence that the etymology of “cancel” leads to the Latin “cancelli,” derived from “cancri”/“cancer,” a lattice or grid of crossed bars: a barrier, in other words, linked by dissimilation to “carcer” (prison), and in its early adaptation to English taken literally, as a crossing out, lines drawn through words on paper.
THE SHEER ARBITRARINESS of some of the targets of cancel culture — singled out among many who might have committed comparable sins, often neither public figures nor possessors of institutional power but utterly ordinary people before their swift, simultaneous elevation-degradation to infamy — lends a ritualistic distance to the attacks, enabling a casual cruelty, as in the American writer Shirley Jackson’s infamous short story “The Lottery” (1948), when the villagers qualmlessly turn on one of their randomly selected own. The French philosopher René Girard, in “Violence and the Sacred” (1972), notes that “the very fact of choosing a victim bestows on him the aura of exteriority … the surrogate victim is not perceived as he really was — namely, as a member of the community like all the others.” To justify vindictiveness, you can’t recognize yourself in those you denounce; you have to believe, as Taylor writes, that they “really deserve it.”
Critics of cancel culture see parallels in the Jacobins of the French Revolution in the 18th century, the Red Guards of the Chinese Cultural Revolution from 1966 to 1976 and the estimated 600,000 to 2 million private citizens — out of a population of around 17 million — who acted as part-time informants for the Stasi, the East German secret police, from 1950 to 1990. None are proper analogues, for all derived their punitive power from the state. Allusions are also made to the Spanish Inquisition, which persecuted heresy from the 15th century to the 19th, and the Salem witch trials in late 17th-century Massachusetts, both a joint effort of church and state, when there was little distinction between them. These examples are relevant only in showing how the archaic use of violence to affirm purity has evolved to serve latter-day ideologies. In France, the spree at the guillotine was rationalized as the pursuit of good: a Reign of Terror to yield a Republic of Virtue. (The revolutionary leader Maximilien Robespierre, who famously declared in 1794 that without terror, “virtue is impotent,” supported the future elimination of the death penalty even as he ordered executions by the thousands.) Mao Zedong embroidered the same theme in a letter to his wife in 1966, invoking “great disorder under heaven” in order to achieve “great order.” And while some Stasi informants may have reported on their friends and neighbors out of fear, researchers have determined that most did so to safeguard the state’s righteousness and, by extension, their own.
Compared to these authoritarian regimes, however, cancel culture is rudderless, a series of spontaneous disruptions with no sequential logic, lacking any official apparatus to enact or enforce a policy or creed. If anything, it’s anti-authoritarian: Historically, Westerners do not approve of informing on behalf of the government and its enforcers, giving the act shaded names like “snitch” and “nark,” the latter explicitly defined in an 1859 British slang dictionary as someone who “breaks faith.” Children are advised not to be tattletales. (We’re more comfortable with whistle-blowers, who speak out against the powerful.)
What cancellations offer instead is a surrogate, warped-mirror version of the judicial process, at once chaotic yet ritualized. It’s a paradox reminiscent of the mayhem in medieval Catholic traditions of carnival and misrule, wherein the church and governing bodies were lampooned and hierarchy upended — all without actually threatening the prevailing hegemony, and even reaffirming it. “Misrule always implies the Rule that it parodies,” the American-Canadian historian Natalie Zemon Davis has written; the very excess and occasional destructiveness of the revelries gave testament to the wisdom of those normally in charge. Davis suggests that these festivals offered “alternatives to the existing order.” But why would the church, which presumably brooked no alternatives, condone such subversion? From its perspective, carnival was a convenient catharsis: a brief hiatus from the moral strictures of daily life, when the populace was allowed to indulge their mutinous impulses and expend their restive energies, the better to return to compliance on the morrow.
It’s instructive that, for all the fear that cancel culture elicits, it hasn’t succeeded in toppling any major figures — high-level politicians, corporate titans — let alone institutions. Those most vulnerable to harm tend to be individuals previously unknown to the public, like the communications director who was fired in 2013 after tweeting, from her personal account, an ill-thought-out joke about Africa, AIDS and her own white privilege (she landed another job six months later) or the data analyst who was fired last spring after tweeting, in the wake of protests against the death of George Floyd in police custody, a study that suggested that riots depressed rather than increased Democratic Party votes (his employer has denied that the tweet was the cause for his dismissal) — although both situations reveal less about the impact of cancel culture than the precariousness of at-will employment, in which one can be fired for any reason, whether legitimate or not. The more power someone has, the less affected they are: The British writer J.K. Rowling, one of the signatories of the Harper’s letter, has been publicly excoriated in the past year for expressing her views on gender identity and biological sex, but people continue to buy her books; disgraced high-profile comedians who’ve returned to the stand-up circuit, not always repentant, have been rewarded with sold-out shows. When the mighty do fall, it often takes years, coupled with behavior that’s not just immoral but illegal. The studio head Harvey Weinstein was indicted for crimes, not canceled.
In a 1972 conversation with the French theorist Michel Foucault, the French philosopher Benny Lévy (then using the nom de guerre Pierre Victor) pointed to the example, at the end of World War II, of “those young women whose heads were shaved because they had slept with the Germans” — while a number of those who had actively collaborated with the Nazis went unpunished: “So the enemy was allowed to exploit these acts of popular justice; not the old enemy — the Nazi occupation forces … but the new enemy, the French bourgeoisie.” In keeping a narrow focus on small-scale violations of the social contract, cancel culture has uncomfortable kinship, as the American essayist Meghan Daum has written, to the “broken windows” policing put into practice starting in the 1980s, based on a theory by the American criminologists George L. Kelling and James Q. Wilson that posited that cracking down on minor crimes would prevent larger ones. Instead, it led to the scourge of stop-and-frisk, in which ordinary people, innocent of a crime and disproportionately of color, were routinely and repeatedly treated like suspects and searched, manhandled and interrogated as such.
The trespasses cited in cancel culture often do encapsulate and typify greater ills, as when a white woman called the police on a Black birder in Central Park last spring and falsely claimed that he was threatening her. Holding these acts up as evidence of the dailiness of inequity might be revelatory for some and even budge the needle on how people think of racism, misogyny and class oppression in America today. As the British sociologist Stanley Cohen wrote, when crowds muster against perceived threats to public mores — in what we call a moral panic — those threats, while exaggerated, are still potent as “warning signs of the real, much deeper and more prevalent condition.” But moral panics were traditionally engineered by those in power to reassert the need for modes of control, or by commercial interests to profit off the attention that comes via scandal. They were forms of manipulation, diverting public ire from structural injustice toward a specific ostracized group as an embodiment of evil, or folk devils, a coinage by Cohen in the late 1960s. (Fear of cancel culture is itself a moral panic — a moral panic over moral panics, one orchestrated on high over those generated extempore below.)
Although in cancel culture the moral panics are roving and unpremeditated, they can still be exploited for the benefit of the dominant class. So long as the folk devils of cancel culture are plucked from the masses or are merely artsy celebrities or subalterns of politics or industry, the world stays essentially the same.
CANCEL CULTURE MAY have reached its apotheosis this September when a professor of history and Africana studies at George Washington University admitted online that she was white, not Black, as she had been posing for her entire career. “You should absolutely cancel me, and I absolutely cancel myself,” she declared, but then added, “What does that mean? I don’t know,” nullifying the entire premise. Self-abasement was tendered, but no concrete action. She affirmed the importance of cancel culture as “a necessary and righteous tool for those with less structural power to wield against those with more power,” yet insisted, “I can’t fix this,” as if she could embrace accountability without actually doing anything to alter her actions; as if she had no power to remove herself from power. Only after the university began investigating her public statement did she resign from her tenured position, nearly a week later.
On Twitter, people speak scoffingly of canceling themselves, as a joke or a pre-emptive measure, since presumably any of us could be canceled at any time, living in our glass Instagrams, leaving a spoor of digitized gaffes behind us. (The Canadian media theorist Marshall McLuhan eerily anticipated cancel culture in his 1967 book “The Medium Is the Massage” — the title was a typesetter’s error that McLuhan embraced — expressing concern, before the first resource-sharing computer network was even completed, about the “womb-to-tomb surveillance” made possible by “the electrically computerized dossier bank — that one big gossip column that is unforgiving, unforgetful and from which there is no redemption, no erasure of early ‘mistakes.’”) There’s the tacit hope that if we have the grace to cancel ourselves first, our ostracism will be temporary, a mere vacation from social media. Absolution is reduced to performance, a walk with bowed head through jeers and splattered mud. Instead of retreating into introspection and actually examining our behavior, we submit to punishment and imagine ourselves thereby purged of both sin and the need to do anything about it. We emerge clean, or so we let ourselves believe.
But what is the point of all this flagellation, of self and others, if meanwhile the structures that enable wrongdoing continue to creak and loom, doing business as usual? The scapegoat was not always a marginal figure. Consider Oedipus, the tyrannos-pharmakos of Thebes and unknowing sinner whose crimes brought great suffering to his people — blighted crops, plague — and who had to be sacrificed that they might live. This specter, of the sovereign laid low, appears to haunt the American entrepreneur and venture capitalist Peter Thiel, who in his 2014 treatise-slash-self-help manual “Zero to One” (co-written with Blake Masters) casts a glance at the restive hordes below: “Perhaps every modern king is just a scapegoat who has managed to delay his own execution” — although it’s worth noting that today’s potentates rule unhindered by the bygone fetters of interfering gods and binding prophecies.
There was a time when we lived in a moral economy, which is to say, an economy that acknowledged, if not always observed, moral concerns. The British social historian E.P. Thompson used the term as a framework for understanding food riots in 18th-century England, when, in times of dearth, people set their sights on profiteers and organized what he described as “a kind of ritualized hooting or groaning” outside shops to make their displeasure known. Today we hoot and groan still, but seemingly everywhere and at everything, so that even the worthiest and most urgent causes get lost in the clamor. The many subcultures whose complaints buoy the larger, nebulous cancel culture tend to fixate on minutiae, which can distract from attempts to achieve broader change.
And this may be an intentional distraction. Every obsessive search on Google for proof of wrongdoing, every angry post on Twitter and Facebook to call the guilty to account, is a silent ka-ching in the great repositories of these corporations, which woo advertisers by pointing to the intensity of user engagement. “Despite the egalitarianism claimed for social media by capital’s libidinal engineers … this is currently an enemy territory, dedicated to the reproduction of capital,” the British cultural critic Mark Fisher wrote in his 2013 essay “Exiting the Vampire Castle.” Twitter, cancel culture’s main arena, is not the digital equivalent of the public square, however touted as such. We think of it as an open space because we pay no admission, forgetting that it’s a commercial enterprise, committed to herding us in. We are customers but also uncredited workers, doing the free labor of making the platform more valuable.
For now, this is the circus that sates us, that keeps us from waking to the truth of our life and turning, glowering, toward the barred gates. We burn our effigies, forgetting that they’re actual people like us, as our overlords look on from afar, brows knitted but not quite worried, not yet. Still, these “modern kings” would do well to remember: In Sophocles’ telling, Oedipus doesn’t run from his fate. He begs for exile, to heal his people. He cancels himself.