
These Truths


by Jill Lepore


  Jones, wild with malice, cut through the American political imagination with a chainsaw rigged to a broom handle, flailing and gnashing. In 2008, when Barack Obama sought the Democratic nomination for president in a close competition with Hillary Clinton, Jones and other truthers became birthers: they argued that Obama, who was born in Hawaii—an event reported in two Hawaiian newspapers and recorded on his birth certificate—had been born in Kenya. The truthers were on the far fringes, but even the broader American public raised an eyebrow at Obama’s name, Barack Hussein Obama, at a time when the United States’ declared enemies were Osama bin Laden and Saddam Hussein. Urged to change his name, Obama refused. Instead, he joked about it. “People call me ‘Alabama,’” he’d say on the campaign trail. “They call me ‘Yo Mama.’ And that’s my supporters!”19

  So far from changing his name, Obama made his story his signature. His 2008 campaign for “Hope” and “Change” was lifted by soaring storytelling about the nation’s long march to freedom and equality in which he used his own life as an allegory for American history, in the tradition of Benjamin Franklin, Andrew Jackson, and Frederick Douglass. But Obama’s story was new. “I am the son of a black man from Kenya and a white woman from Kansas,” he said. “These people are a part of me. And they are a part of America.” Obama’s American family was every color, and part of a very big world. “I have brothers, sisters, nieces, nephews, uncles and cousins, of every race and every hue, scattered across three continents, and for as long as I live, I will never forget that in no other country on Earth is my story even possible.”20

  Obama’s election as the United States’ first black president was made possible by centuries of black struggle, by runaways and rebellions, by war and exile, by marches and court cases, by staggering sacrifices. “Barack Obama is what comes at the end of that bridge in Selma,” said the much-admired man who had marched at Selma, John Lewis.21 His victory seemed to usher in a new era in American history, a casting off of the nation’s agonizing legacy of racial violence, the realizing, at long last, of the promises made in the nation’s founding documents. Yet as he took office in 2009, Obama inherited a democracy in disarray. The United States was engaged in two distant wars with little popular support and few achievable objectives, fought by a military drawn disproportionately from the poor—as if they were drones operated by richer men. The economy had collapsed in one of the worst stock market crashes in American history. The working class had seen no increase in wages for more than a generation. One in three black men between the ages of twenty and twenty-nine was in prison or on probation.22 Both parties had grown hollow—hard and partisan on the outside, empty on the inside—while political debate, newly waged almost entirely online, had become frantic, desperate, and paranoid. Between 1958 and 2015, the proportion of Americans who told pollsters that they “basically trust the government” fell from 73 percent to 19 percent.23 Forty years of a relentless conservative attack on the government and the press had produced a public that trusted neither. Forty years of identity politics had shattered Rooseveltian liberalism; Obama walked on shards of glass.

  Even as Obama embraced a family of cousins scattered across continents, nationalism and even white supremacy were growing in both the United States and Europe in the form of populist movements that called for immigration restriction, trade barriers, and, in some cases, withdrawal from international climate accords. New movements emerged from the right—the Tea Party in 2009 and the alt-right in 2010—and from the left: Occupy in 2011, Black Lives Matter in 2013. Activists on the left, including those aligned with an antifascist resistance known as antifa, self-consciously cast their campaigns as international movements, but the new American populism and a resurgent white nationalism had their counterparts in other countries, too. Whatever their political differences, they shared a political style. In a time of accelerating change, both the Far Left and the Far Right came to understand history itself as a plot, an understanding advanced by the very formlessness of the Internet, anonymous and impatient. Online, the universe appeared to be nothing so much as an array of patterns in search of an explanation, provided to people unwilling to trust to any authority but that of their own fevered, reckless, and thrill-seeking political imaginations.

  In 2011, during Obama’s first term, the aging New York businessman, television star, and on-again, off-again presidential candidate Donald Trump aligned himself with the truthers and the birthers by questioning the president’s citizenship. In a country where the Supreme Court had ruled, in Dred Scott, that no person of African descent could ever be an American citizen, to say that Obama was not a citizen was to call upon centuries of racial hatred. Like 9/11 conspiracy theorists, Obama conspiracy theorists (who were in many cases the same people) were forever adding details to their story: the president was born in Nairobi; he was educated at a madrasa in Jakarta; he was secretly a Muslim; he was, still more secretly, an anti-imperialist African nationalist, like his father; he was on a mission to make America African.24 “The most powerful country in the world,” right-wing pundit Dinesh D’Souza warned, “is being governed according to the dreams of a Luo tribesman of the 1950s.”25

  Trump, bypassing newspapers and television and broadcasting directly to his supporters, waged this campaign online, through his Twitter account. “An ‘extremely credible source’ has called my office and told me that @BarackObama’s birth certificate is a fraud,” he tweeted in 2012.26 Trump did not back off this claim as he pursued the Republican nomination in 2015.27 The backbone of his campaign was a promise to build a wall along the U.S.-Mexican border. After 9/11, a white nationalist movement that had foundered for decades had begun to revive, in pursuit of two goals: preserving the icons of the Confederacy, and ending the immigration of dark-skinned peoples.28 Trump, announcing his candidacy from New York’s Trump Tower, gave a speech in which he called Mexicans trying to enter the United States “rapists,” borrowing from a book by Ann Coulter called ¡Adios, America!29 (On immigration and much else, Coulter promoted herself as a courageous teller of truths in a world of lies. “Every single elite group in America is aligned against the public—the media, ethnic activists, big campaign donors, Wall Street, multimillionaire farmers, and liberal ‘churches,’” Coulter wrote. “The media lie about everything, but immigration constitutes their finest hour of collective lying.”)30 Obama had promised hope and change. Trump promised to Make America Great Again.

  Hillary Clinton, having lost the Democratic nomination to Obama in 2008, won it in 2016 and hoped to become the first female president. Her campaign misjudged Trump and not only failed to address the suffering of blue-collar voters but also insulted Trump’s supporters, dismissing half of them as a “basket of deplorables.” Mitt Romney had done much the same thing as the Republican nominee in 2012, when, with seething contempt, he dismissed the “47 percent” of the U.S. population—Obama’s supporters—as people “who believe they are victims.”31 Party politics had so far abandoned any sense of a national purpose that, within the space of four years, each party’s presidential nominee declared large portions of the population of the United States unworthy of attention and beneath contempt.

  Trump, having secured the nomination, campaigned against Clinton, aided by the UK data firm Cambridge Analytica, by arguing that she belonged in jail. “She is an abject, psychopathic, demon from Hell that as soon as she gets into power is going to try to destroy the planet,” said Jones, who sold “Hillary for Prison” T-shirts. “Lock Her Up,” Trump’s supporters said at his rallies.32

  American history became, in those years, a wound that bled, and bled again. Gains made toward realizing the promise of the Constitution were lost. Time seemed to be moving both backward and forward. Americans fought over matters of justice, rights, freedom, and America’s place in the world with a bitter viciousness, and not only online. Each of the truths on which the nation was founded and for which so many people had fought was questioned. The idea of truth itself was challenged. The only agreed-upon truth appeared to be a belief in the ubiquity of deception. The 2012 Obama campaign assembled a Truth Team.33 “You lie!” a South Carolina congressman called out to President Obama, during a joint session of Congress in 2009. “You are fake news!” Trump said to a CNN reporter at an event at the White House.34

  “Let facts be submitted to a candid world,” Jefferson had written in the Declaration of Independence, founding a nation by appealing to truth. But whatever had been left of a politics of reasoned debate, of inquiry and curiosity, of evidence and fair-mindedness, seemed to have been eradicated when, on December 2, 2015, Trump appeared on Infowars by Skype from Trump Tower. At an earlier campaign rally, Trump had said that on 9/11 he’d been watching television from his penthouse, and had seen footage of “thousands and thousands of people,” Muslims, cheering from rooftops in New Jersey.35 Jones began by congratulating Trump on being vindicated on this point. (Trump had not, in fact, been vindicated, and no such footage has ever been found.) Jones, sputtering, gushed about the historic nature of Trump’s campaign.

  “What you’re doing is epic,” Jones told Trump. “It’s George Washington level.”

  “Your reputation’s amazing,” Trump told Jones, promising, “I will not let you down.”36

  Five days later, Trump called for a “total and complete shutdown of the entry of Muslims to the United States.”37 In place of towers, there would be walls.

  Between the attacks on September 11, 2001, and the election of Donald Trump fifteen years later, on November 8, 2016, the United States lost its way in a cloud of smoke. The party system crashed, the press crumbled, and all three branches of government imploded. There was real fear that the American political process was being run by Russians, as if, somehow, the Soviets had won the Cold War after all. To observers who included the authors of books like How Democracy Ends, Why Liberalism Failed, How the Right Lost Its Mind, and How Democracies Die, it seemed, as Trump took office, as if the nation might break out in a civil war, as if the American experiment had failed, as if democracy itself were in danger of dying.38

  I.

  IT BEGAN, in the year 1999, with a panic. Computer programmers predicted that at one second after midnight on January 1, 2000, all the world’s computers, unable to accommodate a year that did not begin with “19,” would crash. Even before the twenty-first century began, even before no small number of political dystopians forecast a thousand-year clash of civilizations or the imminent death of democracy, Americans were subjected to breathless warnings of millennial doom, a ticking clock catastrophe, not the global annihilation timed by the atomic age’s Doomsday Clock but a disaster, a “Y2K bug,” embedded into the programs written to run on the microprocessor tucked into the motherboard of the hulking computer perched on every desktop. After much rending of garments and gnashing of teeth, this bug was quietly and entirely fixed. The end of the world averted, digital prophets next undertook to predict the exact date of the arrival of an Aquarian age of peace, unity, and harmony, ushered in by, of all things, the Internet.
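The failure that programmers feared is easy to sketch: legacy code often stored only the last two digits of the year and subtracted them directly, so arithmetic that worked through 1999 went haywire once “99” rolled over to “00.” A minimal, hypothetical Python illustration (the function and scenario are illustrative, not drawn from the text):

```python
# Hypothetical sketch of the two-digit-year arithmetic behind the Y2K fear:
# many legacy programs stored only the last two digits of the year,
# implicitly assuming a leading "19".

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Elapsed years computed from two-digit years, as old code often did."""
    return end_yy - start_yy

# A person born in 1970, measured in 1999: the answer is correct.
print(years_elapsed(70, 99))  # 29

# The same calculation in 2000, when the year field reads "00": nonsense.
print(years_elapsed(70, 0))   # -70
```

Remediation mostly meant widening date fields to four digits, or “windowing” two-digit years onto a pivot that split them between the 1900s and 2000s.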

  In the spring of 2000, Wired, the slick, punk, Day-Glo magazine of the dot-com era, proclaimed “One Nation, Interconnected,” announcing that the Internet had, in fact, already healed a divided America: “We are, as a nation, better educated, more tolerant, and more connected because of—not in spite of—the convergence of the Internet and public life. Partisanship, religion, geography, race, gender, and other traditional political divisions are giving way to a new standard—wiredness—as an organizing principle for political and social attitudes.”39 Of all the wide-eyed technological boosterism in American history, from the telegraph to the radio, few pronouncements rose to such dizzying rhetorical heights.

  Over the course of the twentieth century, the United States had assumed an unrivaled position in the world as the defender of liberal states, democratic values, and the rule of law. From NATO to NAFTA, relations between states had been regulated by pacts, free trade agreements, and restraint. But, beginning in 2001, with the war on terror, the United States undermined and even abdicated the very rules it had helped to establish, including prohibitions on torture and wars of aggression.40 By 2016, a “by any means necessary” disregard for restraints on conduct had come to characterize American domestic politics as well. “If you see somebody getting ready to throw a tomato,” Trump told supporters at a campaign rally in Iowa, “knock the crap out of them, would you?”41 Countless factors contributed to these changes. But the crisis of American moral authority that began with the war on terror at the start of the twenty-first century cannot be understood outside of the rise of the Internet, which is everything a rule-based order is not: lawless, unregulated, and unaccountable.

  What became the Internet had begun in the late 1960s, with ARPANET. By the mid-1970s, the Department of Defense’s Advanced Research Projects Agency’s network had grown to an international network of networks: an “internet,” for short. In 1989, in Geneva, Tim Berners-Lee, an English computer scientist, proposed a protocol to link pages on what he called the World Wide Web. The first web page in the United States was created in 1991, at Stanford. Berners-Lee’s elegant protocol spread fast, first across universities and then to the public. The first widely available web browser, Mosaic, was launched in 1993, making it possible for anyone with a personal computer wired to the Internet to navigate web pages around the world, click by astonishing click.42

  Wired, launched in March 1993, flaunted cyberculture’s countercultural origins. Its early contributors included Stewart Brand and John Perry Barlow, a gold-necklace- and scarf-wearing bearded mystic who for many years wrote lyrics for the Grateful Dead. In Wired, the counterculture’s dream of a nonhierarchical, nonorganizational world of harmony found expression in a new digital utopianism, as if every Internet cable were a string of love beads. Brand, writing in an article in Time, “We Owe It All to the Hippies,” announced that “the real legacy of the sixties generation is the computer revolution.”43

  But between the 1960s and the 1990s, the revolution had moved from the far left to the far right. Wired was edited by Louis Rossetto, a libertarian and former anarchist known to lament the influence of the “mainstream media.” In the magazine’s inaugural issue, Rossetto predicted that the Internet would bring about “social changes so profound their only parallel is probably the discovery of fire.” The Internet would create a new, new world order, except it wouldn’t be an order; it would be an open market, free of all government interference, a frontier, a Wild West. In 1990, Barlow had helped found the Electronic Frontier Foundation, to promote this vision. (The EFF later became chiefly concerned with matters of intellectual property, free speech, and privacy.) In 1993, Wired announced that “life in cyberspace seems to be shaping up exactly like Thomas Jefferson would have wanted: founded on the primacy of individual liberty and a commitment to pluralism, diversity and community.”44

  The digital utopians’ think tank was Newt Gingrich’s Progress and Freedom Foundation, established in 1993 (and later the subject of an ethics inquiry); its key thinker was an irrepressible George Gilder, resurrected. Gingrich appeared on the cover of Wired in 1995, Gilder in 1996. Gingrich was battling in Congress for a new Telecommunications Act, the first major revision of the Communications Act of 1934, which had created the FCC (and was itself a revision of the Radio Act of 1927); his objective was to ensure that, unlike radio or television, the new medium would lie beyond the realm of government regulation. At a 1994 meeting of Gingrich’s Progress and Freedom Foundation in Aspen, Gilder, along with futurists Alvin Toffler and Esther Dyson and the physicist George Keyworth, Reagan’s former science adviser, drafted a “Magna Carta for the Information Age.”45 It established the framework of the act Gingrich hoped to pass. Announcing that “cyberspace is the latest American frontier,” the writers of the new Magna Carta contended that while the industrial age might have required government regulation, the knowledge age did not. “If there is to be an ‘industrial policy for the knowledge age,’” their Magna Carta proclaimed, “it should focus on removing barriers to competition and massively deregulating the fast-growing telecommunications and computing industries.”46

  Gingrich got his wish. On February 8, 1996, in an event broadcast live and over the Internet, Bill Clinton signed the Telecommunications Act in the reading room of the Library of Congress; he signed on paper and he also signed online, at a computer terminal.47 Though little noticed at the time, Clinton’s approval of this startling piece of legislation would prove a lasting and terrible legacy of his presidency: it deregulated the communications industry, lifting virtually all of its New Deal antimonopoly provisions, allowing for the subsequent consolidation of media companies and prohibiting regulation of the Internet, with catastrophic consequences.

  Nevertheless, that the U.S. government would even presume to legislate the Internet—even if only to promise not to regulate it—alarmed the Internet libertarians. On the day Clinton signed the bill, Barlow, ex-hippie become the darling of world bankers and billionaires, watching from the World Economic Forum in Davos, Switzerland, wrote a Declaration of Independence of Cyberspace:

  Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather. . . . Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders.48

 
