The Orwell issue of The Village Voice included a short story by Bob Brewin called “Worldlink 2029,” in which “obriens” work for a global computer network that is somewhere between an advanced telescreen and a primitive internet. “The worst kind of Big Brother,” wrote Brewin, was “a machine with no soul run by men who had come close to turning into machines themselves.” As early as 1949, Tribune had tied its review of Nineteen Eighty-Four to a news item about the ominous implications of a new mechanical “brain” developed at Manchester University. Now the fictional popularity of the all-powerful computer—Fate in V for Vendetta, Skynet in The Terminator—reflected public concerns about databases, satellites and surveillance cameras. It was exactly this mounting anxiety that made Chiat/Day want to “smash the old canard that the computer will enslave us” and herald a new era of Apple-driven techno-utopianism. It was also why Walter Cronkite wrote in a New York Times op-ed to promote his CBS special 1984 Revisited: “If Big Brother could just get all the major private and government data banks in America linked, he might be 80 percent of the way home.” The New York Times television critic broadly agreed with Cronkite’s diagnosis but thought that he had missed something important: “the complaisance, the eagerness even, with which we embrace the new technologies.”
This was an apprehension that flew in the face of Apple’s “1984.” What if loss of freedom didn’t require a Big Brother or an Ingsoc? What if we did it to ourselves?
CHAPTER 13
Oceania 2.0
Nineteen Eighty-Four in the Twenty-First Century
The stubbornness of reality is relative. Reality needs us to protect it.
—Hannah Arendt, 1951
In 1984, during a panel discussion on Nineteen Eighty-Four, the American media critic Neil Postman argued that television had radically transformed culture, politics and human behaviour in America in a way that more closely resembled Brave New World than Orwell’s book. He developed this theory into a potent polemic called Amusing Ourselves to Death: “Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us. This book is about the possibility that Huxley, not Orwell, was right.” This striking line appears in the final chapter: “In the Huxleyan prophecy, Big Brother does not watch us, by his choice. We watch him, by ours.” Postman did not expect to be taken literally.
Big Brother, the reality television show that debuted in the Netherlands in 1999, flowed from the realisation that while people still claimed to be concerned about surveillance, a significant number would happily volunteer for it. In 1996, a Pennsylvania college student called Jennifer Ringley installed a webcam in her dorm room and “lifecasted” her every move via her enormously popular JenniCam website. Three years later, the eccentric internet entrepreneur Josh Harris went several steps further by staging an art-project-cum-social experiment called Quiet: We Live in Public. He invited more than one hundred volunteers to live in a six-storey Manhattan warehouse, equipped with all the food, intoxicants and entertainment they could require, and told them that they were free to do as they pleased, on the understanding that it was all recorded by an arsenal of webcams. Harris produced a living metaphor for what the internet would become: a place where people enthusiastically bartered privacy for pleasure, convenience and attention. “I loved living in a world with no secrets and no sense of time,” said one volunteer, “where we were little children, being taken care of.” Both Harris’s and Ringley’s projects were quickly branded “Orwellian.”
If Quiet was the avant-garde expression of a powerful idea, then Big Brother was the primetime version: a social experiment that frequently devolved into a voyeuristic freakshow. Its Dutch creator, John de Mol Jr., was coy about the source of the series title, but when the format arrived in the US in 2000, the name of the production company rather gave the game away: Orwell Productions, Inc. Lawyer William F. Coulson filed a lawsuit on behalf of Marvin Rosenblum and the Orwell estate, accusing the US programme-makers of “dilution and cheapening of the distinctive quality of this mark.” Coulson was referring to the value of the screen rights but the show did something similar to Orwell’s ideas. In Big Brother, housemates live under twenty-four-hour surveillance (“asleep or awake, working or eating, indoors or out of doors, in the bath or in bed,” to quote the novel) and are summoned to the Diary Room (known in some territories as the Confession Room) on behalf of a non-existent Big Brother. In most versions of the show, books and writing implements are forbidden. “Orwell understood the difference between ‘what the public is interested in’ and ‘the public interest,’ ” wrote Orwell’s outraged biographer Bernard Crick. “That is why he wrote that book whose warning has been treated with cynical contempt and is itself treated as ‘prolefeed.’ ” Around the same time, the BBC’s Room 101 rebuilt Orwell’s torture chamber as a cute repository for celebrities’ pet hates.
Not all references to Nineteen Eighty-Four during the 1990s were lightweight. Room 101 was the main character’s address in The Matrix, a 1999 film steeped in questions of freedom, society and the nature of reality, while quotes from the novel still packed a punch in “Testify” by Rage Against the Machine and “Faster” by the Manic Street Preachers. Nevertheless, it felt as if the book might ultimately be trivialised, ironised and, like Winston Smith, squeezed empty. This could only have happened in the decade of end-of-history complacency, when intelligent people could suggest with conviction that Orwell’s warning had worked. “The world of Nineteen Eighty-Four ended in 1989,” wrote Timothy Garton Ash in May 2001. Orwell remained an essential guide to the obfuscations and deceptions of political language, Garton Ash allowed, but his unholy trinity—imperialism, fascism and communism—had fallen: “Forty years after his own painful and early death, Orwell had won.”
Four months later, two passenger jets were flown into the World Trade Center.
In 2003, George Orwell’s centenary, with its inevitable biographies, reissues, conferences and documentaries, took place in a world divided by the US-led invasion of Iraq. Perhaps that’s why, in a listener poll for BBC Radio 4, Nineteen Eighty-Four was voted the quintessential English book, ahead of friendlier shortlisted works by Zadie Smith, Jeremy Paxman, Bill Bryson and Jonathan Coe. “Nineteen Eighty-Four is about power out of control,” commented Bernard Crick. “Maybe people are feeling a sense of horror about two ‘Big Brothers’ who cannot be controlled, or perhaps three. We must throw in Saddam alongside Bush and Blair.”
Critics of the war scrambled for Nineteen Eighty-Four. Paul Foot in The Guardian decried the “doublethink” of “Oceania (the US and Britain).” Radiohead’s album Hail to the Thief opened with a fierce, panicky song called “2 + 2 = 5,” provoked by the “Orwellian euphemisms” that frontman Thom Yorke heard on the news. The Bush administration’s post-9/11 policies loomed large in the documentaries Orwell Rolls in His Grave and Orwell Against the Tide, while Michael Moore’s polemic Fahrenheit 9/11 closed with a paraphrased passage from Goldstein’s book: “The war is waged by the ruling group against its own subjects and its object is not the victory over either Eurasia or Eastasia but to keep the very structure of society intact.” The idea of an interminable “war on terror” certainly brought to mind Oceania, where every restriction is justified because “there is a war on.” Life mirrored art to an alarming degree when a senior aide to President Bush (later identified as Karl Rove, though he has denied it) told The New York Times that the administration had nothing to fear from “the reality-based community . . . who believe that solutions emerge from your judicious study of discernible reality. That’s not the way the world really works anymore. We’re an empire now, and when we act, we create our own reality.” Reading those words, you could almost hear O’Brien’s voice. As a popular slogan put it, Nineteen Eighty-Four was not meant to be a how-to manual.
At the same time, hawks such as Norman Podhoretz and Christopher Hitchens, united by the war against “Islamofascism” twenty years after crossing swords in Harper’s, deployed Orwell’s words to shame their opponents on the left. This tactic went beyond the Iraq war; conservatives routinely threw the term “Thought Police” at anyone who advocated “politically correct” language. The obsession with imagining what Orwell might have said about current events was breeding resentment and fatigue. The political scientist Scott Lucas, the author of two harsh, revisionist books about the writer, distinguished Orwell the man from “Orwell” the symbol: “ ‘Orwell’ has been used as a stick to beat those whose opinions are perceived as troublesome or in any way threatening.” Daphne Patai, one of the most respected authorities on dystopian literature, shared Lucas’s impatience to be rid of “Saint George” and see Orwell as a complex, contradictory figure rather than a moral exemplar. “Shakespeare doesn’t have the moral authority to give us an opinion on the invasion of Iraq,” she said in 2003. “No one would have dreamed of such a thing, but Orwell does get cited for that.”
For many creators of new dystopian fiction, meanwhile, Nineteen Eighty-Four remained the tallest building in the city of nightmares; one didn’t have to enter it, but one couldn’t entirely ignore it. In 1Q84, Haruki Murakami tweaked Orwell’s title (nine and Q are homophones in Japanese); set the action in 1984, beginning in April; and made noisy reference to Orwell in the context of parallel universes and religious cults. The protagonist of Gary Shteyngart’s Super Sad True Love Story, a satire on corporate excess and intellectual decline, is a worn-out, thirty-nine-year-old diarist in love with a cynical younger woman. James McTeigue, director of the 2005 V for Vendetta movie, paid tribute by casting John Hurt as dictator Adam Sutler (the name could have been subtler), who berates his underlings from a giant screen, thus turning Michael Radford’s Winston Smith into a thuggish Big Brother. Politically sophomoric and visually inert, the movie nonetheless resonated widely when cheap plastic versions of V’s Guy Fawkes mask became a global emblem of protest. “V was designed to warn against a grim possibility—like a kind of 1984 in comics,” said David Lloyd, the artist responsible for the design. “And as George Orwell’s message was one that reached out to a wide readership because it spoke of universal matters of importance to us all, it’s no surprise that ours did as well.”
The most resonant twenty-first-century dystopias, however, were notable for their distance from Orwell. Works as diverse as Kazuo Ishiguro’s novel Never Let Me Go, Suzanne Collins’s Young Adult series The Hunger Games, Mike Judge’s savage comedy Idiocracy and the Pixar movie Wall-E satirised decadent capitalism rather than totalitarianism. Philip Roth denied that The Plot Against America, his novel about an alternate timeline where the aviator Charles Lindbergh defeats President Roosevelt in the 1940 election and institutes fascism in America, had much in common with Nineteen Eighty-Four: “Orwell imagined a huge change in the future with horrendous consequences for everyone; I tried to imagine a small change in the past with horrendous consequences for a relative few.” The most striking dystopia of the 2000s was Children of Men, Alfonso Cuarón’s alchemical film adaptation of P. D. James’s 1992 novel. The movie’s near-future England is mean, tawdry and violent but incapable of totalitarianism. Despite surveillance cameras and concentration camps, the prevailing mood is of chaos rather than control, and the furniture of capitalism remains in place, albeit faded and threadbare, because in a world where no babies have been born for eighteen years, there is literally no future. Cuarón’s landscape of exhausted possibilities felt more relevant to the new century’s anxieties, especially after the 2008 financial crisis, than Orwell’s all-powerful tyranny.
So, too, did British screenwriter Charlie Brooker’s anthology TV series Black Mirror, which became the definitive dystopia of the 2010s because it expressed up-to-the-minute anxieties about our insufficiently examined reliance on technology. Each episode takes a current tendency—reality TV, social media, virtual reality, politics as showbusiness—to Swiftian extremes. “Any time there’s a new invention, people say, ‘Oh, that’s a bit Black Mirror,’ ” said Brooker in 2016. They were missing the point. The theme of Black Mirror, as Huxley said of Brave New World, is “not the advancement of science as such; it is the advancement of science as it affects human individuals.” Neil Postman’s line about Huxley’s book—“what we love will ruin us”—could serve as a motto for Brooker’s dystopias of complicity. In HBO’s Black Mirror–ised 2018 version of Fahrenheit 451, the book-burning tyranny is the result of an alliance between government and tech companies. “The Ministry didn’t do this to us,” says one character. “We did it to ourselves. We demanded a world like this.”
There is truth in that. The currency of the twenty-first-century tech industry is data. All but the cagiest internet users routinely tell companies such as Facebook and Google what they like, who they know, where they go, and much more. The writer Rebecca Solnit calls Google “Big Hipster Brother.” She wrote about another of those companies, Apple, on the thirtieth anniversary of its most famous commercial: “Maybe Apple’s ‘1984’ ad is the beginning of Silicon Valley’s fantasy of itself as the solution, not the problem—a dissident rebel, not the rising new Establishment.” Citing government surveillance, hacking, revenge porn and iPhone addiction, Solnit argued that the “Orwell was wrong” triumphalism of the 1980s had been at best premature, if not dishonest. Shaped by powerful corporations with a commercial and philosophical disdain for privacy, online culture “wasn’t a rupture with the past but an expansion of what was worst about that past . . . 2014 has turned out quite a bit like 1984.”
Dave Eggers explored such misgivings in his 2013 novel The Circle. The story of a young woman called Mae Holland’s initiation into the monolithic tech company of the title is an agile satire of Silicon Valley utopianism with sly nods to its predecessors. The famous triad of slogans from Nineteen Eighty-Four is rewritten for the social-media age: “SECRETS ARE LIES / SHARING IS CARING / PRIVACY IS THEFT.” The earthy refusenik driven to his death by a voyeuristic mob recalls John the Savage at the end of Brave New World. The Circle’s ultimate goal of “transparency”—living one’s entire life in public, in “a new and glorious openness, a world of perpetual light”—makes Zamyatin’s glass houses and Orwell’s telescreens look primitive. The bulk of the novel is simply an exaggeration of current trends. Only in its very last chapter does it become a true dystopia in which ownlife has been abolished without any need for force: Mae proves her love for Big Brother by effectively turning her life into a Big Brother house. “What happens to us if we must be ‘on’ all the time?” asked Margaret Atwood in her review. “Then we’re in the twenty-four-hour glare of the supervised prison. To live entirely in public is a form of solitary confinement.” The Circle offers a new incarnation of the place where there is no darkness.
Eggers’s timing was fortuitous. On June 5, 2013, a few months before he published The Circle, The Guardian and The Washington Post revealed the existence of a massive NSA electronic surveillance programme, using documents leaked by computer engineer Edward Snowden. Snowden later said that Orwell “warned us of the danger of this kind of information” but Oceania’s surveillance apparatus was “nothing compared to what we have available today.” As President Obama defended the NSA from Big Brother comparisons, Senator Bernie Sanders called it “very Orwellian,” and The New Yorker asked, “So, Are We Living in 1984?,” sales of Nineteen Eighty-Four shot up by several thousand per cent on Amazon, itself a data-hungry tech giant.
George Orwell did not predict the internet (although E. M. Forster arguably did), and had only a rudimentary understanding of technology, yet he had been lurking in the wings of such conversations since the 1980s. Optimists like Nam June Paik, the creator of Good Morning, Mr. Orwell, saw the internet as the unstoppable force that would render tyranny impossible: “So George Orwell was wrong after all, when he wrote 1984.” Peter Huber irreverently rewrote Nineteen Eighty-Four in Orwell’s Revenge: The 1984 Palimpsest to argue that Orwell was “completely, irredeemably, outrageously wrong” about the telescreen because networked communication, such as the nascent World Wide Web, would bring about a world in which “the proles do the watching, and the Party is whipped into submission.”
Conversely, the novelist Thomas Pynchon wrote in his foreword to the 2003 edition of Nineteen Eighty-Four that the internet was “a development that promises social control on a scale those quaint old twentieth-century tyrants with their goofy mustaches could only dream about.” The Snowden revelations moved the needle towards Pynchon’s analysis. Optimism about the potential of the internet to hold power to account in the perpetual light of unlimited information was beginning to look foolish.
Nineteen Eighty-Four and Brave New World used to be seen as mutually exclusive dystopias. In 1984, however, while Neil Postman was writing Amusing Ourselves to Death, Aldous Huxley’s biographer Sybille Bedford came to a different conclusion, describing the choice as a false binary: “We have entered the age of mixed tyrannies.” By this she meant that the modern power-seeker would assemble whatever combination of coercion, seduction and distraction proved most effective.
The Ministry of Truth Page 31