Out on a Limb
And all the elites who stood in his way? Crippled by their own failures, demoralized by their crumbling stature, they first mock and then cave. As one lone journalist laments before the election (he finds himself in a concentration camp afterward): “I’ve got to keep remembering… that Windrip is only the lightest cork on the whirlpool. He didn’t plot all this thing. With all the justified discontent there is against the smart politicians and the Plush Horses of Plutocracy—oh, if it hadn’t been one Windrip, it’d been another.… We had it coming, we Respectables.”
And, eighty-one years later, many of us did. An American elite that has presided over massive and increasing public debt, that failed to prevent 9/11, that chose a disastrous war in the Middle East, that allowed financial markets to nearly destroy the global economy, and that is now so bitterly divided that Congress is effectively moot in a constitutional democracy: “we Respectables” deserve a comeuppance. The vital and valid lesson of the Trump phenomenon is that if the elites cannot govern by compromise, someone outside will eventually try to govern by popular passion and brute force.
But elites still matter in a democracy. They matter not because they are democracy’s enemy but because they provide the critical ingredient to save democracy from itself. The political Establishment may be battered and demoralized, deferential to the algorithms of the web and to the monosyllables of a gifted demagogue, but this is not the time to give up on America’s near-unique and stabilizing blend of democracy and elite responsibility. The country has endured far harsher times than the present without succumbing to rank demagoguery; it avoided the fascism that destroyed Europe; it has channeled extraordinary outpourings of democratic energy into constitutional order. It seems shocking to argue that we need elites in this democratic age—especially with vast inequalities of wealth and elite failures all around us. But we need them precisely to protect this precious democracy from its own destabilizing excesses.
And so those Democrats who are gleefully predicting a Clinton landslide in November need to both check their complacency and understand that the Trump question really isn’t a cause for partisan schadenfreude anymore. It’s much more dangerous than that. Those still backing the demagogue of the left, Bernie Sanders, might want to reflect that their critique of Clinton’s experience and expertise—and their facile conflation of that with corruption—is only playing into Trump’s hands. That it will fall to Clinton to temper her party’s ambitions will be uncomfortable to watch, since her willingness to compromise and equivocate is precisely what many Americans find so untrustworthy. And yet she may soon be all we have left to counter the threat. She needs to grasp the lethality of her foe, moderate the kind of identity politics that unwittingly empowers him, make an unapologetic case that experience and moderation are not vices, address much more directly the anxieties of the white working class—and Democrats must listen.
More to the point, those Republicans desperately trying to use the long-standing rules of their own nominating process to thwart this monster deserve our passionate support, not our disdain. This is not the moment to remind them that they partly brought this on themselves. This is a moment to offer solidarity, especially as the odds are increasingly stacked against them. Ted Cruz and John Kasich face their decisive battle in Indiana on May 3. But they need to fight on, with any tactic at hand, all the way to the bitter end. The Republican delegates who are trying to protect their party from the whims of an outsider demagogue are, at this moment, doing what they ought to be doing to prevent civil and racial unrest, an international conflict, and a constitutional crisis. These GOP elites have every right to deploy whatever rules or procedural roadblocks they can muster, and they should refuse to be intimidated.
And if they fail in Indiana or Cleveland, as they likely will, they need, quite simply, to disown their party’s candidate. They should resist any temptation to loyally back the nominee or to sit this election out. They must take the fight to Trump at every opportunity, unite with Democrats and Independents against him, and be prepared to sacrifice one election in order to save their party and their country.
For Trump is not just a wacky politician of the far right, or a riveting TV spectacle, or a Twitter phenom and bizarre working-class hero. He is not just another candidate to be parsed and analyzed by TV pundits in the same breath as all the others. In terms of our liberal democracy and constitutional order, Trump is an extinction-level event. It’s long past time we started treating him as such.
I Used to Be a Human Being
September 19, 2016 | NEW YORK magazine
I was sitting in a large meditation hall in a converted novitiate in central Massachusetts when I reached into my pocket for my iPhone. A woman in the front of the room gamely held a basket in front of her, beaming beneficently, like a priest with a collection plate. I duly surrendered my little device, only to feel a sudden pang of panic on my way back to my seat. If it hadn’t been for everyone staring at me, I might have turned around immediately and asked for it back. But I didn’t. I knew why I’d come here.
A year before, like many addicts, I had sensed a personal crash coming. For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every twenty minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long.
I was, in other words, a very early adopter of what we might now call living in the web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone—connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. Twitter emerged as a form of instant blogging of microthoughts. Users were as addicted to the feedback as I had long been—and even more prolific. Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never stopping, this always updating. I remember when I decided to raise the ante on my blog in 2007 and update every half hour or so and my editor looked at me as if I were insane. But the insanity was now banality; the once-unimaginable pace of the professional blogger was now the default for everyone.
If the internet killed you, I used to joke, then I would be the first to find out. Years later, the joke was running thin. In the last year of my blogging life, my health began to give out. Four bronchial infections in twelve months had become progressively harder to kick. Vacations, such as they were, had become mere opportunities for sleep. My dreams were filled with the snippets of code I used each day to update the site. My friendships had atrophied as my time away from the web dwindled. My doctor, dispensing one more course of antibiotics, finally laid it on the line: “Did you really survive HIV to die of the web?”
But the rewards were many: an audience of up to one hundred thousand people a day; a new-media business that was actually profitable; a constant stream of things to annoy, enlighten, or infuriate me; a niche in the nerve center of the exploding global conversation; and a way to measure success—in big and beautiful data—that was a constant dopamine bath for the writerly ego. If you had to reinvent yourself as a writer in the internet age, I reassured myself, then I was ahead of the curve. The problem was that I hadn’t been able to reinvent myself as a human being.
I tried reading books, but that skill now began to elude me. After a couple of pages, my fingers twitched for a keyboard. I tried meditation, but my mind bucked and bridled as I tried to still it. I got a steady workout routine, and it gave me the only relief I could measure for an hour or so a day. But over time in this pervasive virtual world, the online clamor grew louder and louder. Although I spent hours each day, alone and silent, attached to a laptop, it felt as if I were in a constant cacophonous crowd of words and images, sounds and ideas, emotions and tirades—a wind tunnel of deafening, deadening noise. So much of it was irresistible, as I fully understood. So much of the technology was irreversible, as I also knew. But I’d begun to fear that this new way of living was actually becoming a way of not living.
By the last few months, I realized I had been engaging—like most addicts—in a form of denial. I’d long treated my online life as a supplement to my real life, an add-on, as it were. Yes, I spent many hours communicating with others as a disembodied voice, but my real life and body were still here. But then I began to realize, as my health and happiness deteriorated, that this was not a both-and kind of situation. It was either-or. Every hour I spent online was not spent in the physical world. Every minute I was engrossed in a virtual interaction I was not involved in a human encounter. Every second absorbed in some trivia was a second less for any form of reflection, or calm, or spirituality. “Multitasking” was a mirage. This was a zero-sum question. Either I lived as a voice online or I lived as a human being in the world that humans had lived in since the beginning of time.
And so I decided, after fifteen years, to live in reality.
Since the invention of the printing press, every new revolution in information technology has prompted apocalyptic fears. From the panic that easy access to the vernacular English Bible would destroy Christian orthodoxy all the way to the revulsion, in the 1950s, at the barbaric young medium of television, cultural critics have moaned and wailed at every turn. Each shift represented a further fracturing of attention—continuing up to the previously unimaginable kaleidoscope of cable television in the late twentieth century and the now infinite, infinitely multiplying spaces of the web. And yet society has always managed to adapt and adjust, without obvious damage, and with some more-than-obvious progress. So it’s perhaps too easy to view this new era of mass distraction as something newly dystopian.
But it sure does represent a huge leap from even the very recent past. The data bewilder. Every single minute on the planet, YouTube users upload four hundred hours of video and Tinder users swipe profiles over a million times. Each day, there are literally billions of Facebook “likes.” Online outlets now publish exponentially more material than they once did, churning out articles at a rapid-fire pace, adding new details to the news every few minutes. Blogs, Facebook feeds, Tumblr accounts, tweets, and propaganda outlets repurpose, borrow, and add top spin to the same output.
We absorb this content (as writing or video or photography is now called) no longer primarily by buying a magazine or paper, by bookmarking our favorite website, or by actively choosing to read or watch. We are instead guided to these info nuggets by myriad little interruptions on social media, all cascading at us with individually tailored relevance and accuracy. Do not flatter yourself in thinking that you have much control over which temptations you click on. Silicon Valley’s technologists and their ever-perfecting algorithms have discovered the form of bait that will have you jumping like a witless minnow. No information technology ever had this depth of knowledge of its consumers—or greater capacity to tweak their synapses to keep them engaged.
And the engagement never ends. Not long ago, surfing the web, however addictive, was a stationary activity. At your desk at work, or at home on your laptop, you disappeared down a rabbit hole of links and resurfaced minutes (or hours) later to reencounter the world. But the smartphone then went and made the rabbit hole portable, inviting us to get lost in it anywhere, at any time, whatever else we might be doing. Information soon penetrated every waking moment of our lives.
And it did so with staggering swiftness. We almost forget that ten years ago there were no smartphones and as recently as 2011 only a third of Americans owned one. Now nearly two-thirds do. That figure reaches 85 percent when you’re only counting young adults. And 46 percent of Americans told Pew surveyors last year a simple but remarkable thing: they could not live without one. The device went from unknown to indispensable in less than a decade. The handful of spaces where it was once impossible to be connected—the airplane, the subway, the wilderness—are dwindling fast. Even hiker backpacks now come fitted with battery power for smartphones. Perhaps the only “safe space” that still exists is the shower.
Am I exaggerating? A small but detailed 2015 study of young adults found that participants were using their phones five hours a day, at eighty-five separate times. Most of these interactions were for less than thirty seconds, but they add up. Just as revealing: the users weren’t fully aware of how addicted they were. They thought they picked up their phones half as much as they actually did. But whether they were aware of it or not, a new technology had seized control of around one-third of these young adults’ waking hours.
The interruptions often feel pleasant, of course, because they are usually the work of your friends. Distractions arrive in your brain connected to people you know (or think you know), which is the genius of social, peer-to-peer media. Since our earliest evolution, humans have been unusually passionate about gossip, which some attribute to the need to stay abreast of news among friends and family as our social networks expanded. We were hooked on information as eagerly as sugar. And give us access to gossip the way modernity has given us access to sugar and we have an uncontrollable impulse to binge. A regular teen Snapchat user, as The Atlantic recently noted, can have exchanged anywhere from ten thousand to as many as four hundred thousand snaps with friends. As the snaps accumulate, they generate publicly displayed scores that bestow the allure of popularity and social status. This, evolutionary psychologists will attest, is fatal. When provided a constant source of information and news and gossip about one another—routed through our social networks—we are close to helpless.
Just look around you—at the people crouched over their phones as they walk the streets, or drive their cars, or walk their dogs, or play with their children. Observe yourself in line for coffee, or in a quick work break, or driving, or even just going to the bathroom. Visit an airport and see the sea of craned necks and dead eyes. We have gone from looking up and around to constantly looking down.
If an alien had visited America just five years ago, then returned today, wouldn’t this be its immediate observation? That this species has developed an extraordinary new habit—and, everywhere you look, lives constantly in its thrall?
I arrived at the meditation retreat center a few months after I’d quit the web, throwing my life and career up in the air. I figured it would be the ultimate detox. And I wasn’t wrong. After a few hours of silence, you tend to expect some kind of disturbance, some flurry to catch your interest. And then it never comes. The quiet deepens into an enveloping default. No one spoke; no one even looked another person in the eye—what some Buddhists call noble silence. The day was scheduled down to the minute, so that almost all our time was spent in silent meditation with our eyes closed, or in slow-walking meditation on the marked trails of the forest, or in communal, unspeaking meals. The only words I heard or read for ten days were in three counseling sessions, two guided meditations, and nightly talks on mindfulness.
I’d spent the previous nine months honing my meditation practice, but in this crowd, I was a novice and a tourist. (Everyone around me was attending six-week or three-month sessions.) The silence, it became apparent, was an integral part of these people’s lives—and their simple manner of movement, the way they glided rather than walked, the open expressions on their faces, all fascinated me. What were they experiencing, if not insane levels of boredom?
And how did their calm somehow magnify itself when I was surrounded by them every day? Usually, when you add people to a room, the noise grows; here it was the silence that seemed to compound itself. Attached to my phone, I had been accompanied for so long by verbal and visual noise, by an endless bombardment of words and images, and yet I felt curiously isolated. Among these meditators, I was alone in silence and darkness, yet I felt almost at one with them. My breathing slowed. My brain settled. My body became much more available to me. I could feel it digesting and sniffing, itching and pulsating. It was as if my brain were moving away from the abstract and the distant toward the tangible and the near.
Things that usually escaped me began to intrigue me. On a meditative walk through the forest on my second day, I began to notice not just the quality of the autumnal light through the leaves but the splotchy multicolors of the newly fallen, the texture of the lichen on the bark, the way in which tree roots had come to entangle and overcome old stone walls. The immediate impulse—to grab my phone and photograph it—was foiled by an empty pocket. So I simply looked. At one point, I got lost and had to rely on my sense of direction to find my way back. I heard birdsong for the first time in years. Well, of course, I had always heard it, but it had been so long since I listened.
My goal was to keep thought in its place. “Remember,” my friend Sam Harris, an atheist meditator, had told me before I left, “if you’re suffering, you’re thinking.” The task was not to silence everything within my addled brain, but to introduce it to quiet, to perspective, to the fallow spaces I had once known where the mind and soul replenish.
Soon enough, the world of “the news,” and the raging primary campaign, disappeared from my consciousness. My mind drifted to a trancelike documentary I had watched years before, Philip Gröning’s Into Great Silence, on an ancient Carthusian monastery and silent monastic order in the Alps. In one scene, a novice monk is tending his plot of garden. As he moves deliberately from one task to the next, he seems almost in another dimension. He is walking from one trench to another, but never appears focused on actually getting anywhere. He seems to float, or mindfully glide, from one place to the next.