Present Shock: When Everything Happens Now
And this struggle to be in more than one place at the same time leads to the next main type of present shock.
DIGIPHRENIA
BREAKING UP IS HARD TO DO
Things are too busy. “An Over-Proximity of All Things.” I’ve been fundraising. Negotiating. Flying. Every week, two plane trips, two destinations. It is all at once. No time, no time.
—SMS to me from cyborg anthropologist Amber Case
You know you’re in trouble when even Google thinks you have exceeded the human capacity to exist in more than one place at a time.
It was morning, Europe-time, and I was sitting in the lobby of a hotel in a remote suburb of Berlin, jet-lagged and disoriented, unable to check in to my room until later that afternoon. Temporal limbo. But the Internet never sleeps and the lobby had WiFi, so I figured I’d get some work done. I had neglected my email for the fourteen hours it took to get from outside New York to outside Berlin, and there were already second and third pings in the inbox from people wondering why they hadn’t yet received a reply to their initial missives. (Teaches me right for answering emails so quickly and creating unrealistic expectations.)
Not that I hadn’t made a valiant effort to connect while in transit. The plane had some sort of WiFi while we were over land, but someone kept trying to make Skype calls, which crashed all the passengers’ connections. After the flight attendant restarted the system for the third time, I gave up, took a pill, and fell asleep to the movie. I tried again from my iPhone at the airport but couldn’t log on. Now that I was at the hotel with plug-in power, I knew I’d have an easier time catching up with all that was waiting for me.
The emails with the most little red exclamation points next to them came from an executive in New York desperate to find a speaker for an event about Internet ethics. I knew I had a talk in Missouri on the day following his event but thought that maybe I could do the New York talk in the morning and still make my flight to Missouri that afternoon. I just had to log in to my Google Calendar to make sure. Google said I was using an unfamiliar IP address, so it wanted to ask a few questions to make sure I was really me. My childhood pet, my city of birth . . . that’s when my thirty minutes of connectivity ran out. I went to the front desk for another little slip of paper with a password on it and got back online and back to the place where I needed to prove my identity to Google. Except now Google was really upset with me. It said I appeared to be attempting to access the site from too many locations at once. As a precaution, Google would lock my account until it could verify I was really me. Google wanted to send an automatic text message to my cell phone so I could authenticate my identity—but my cell phone did not work in Germany.
So in the haze of that rainy German morning, I foolishly decided to keep planning my digital-era schedule without access to my digital-era calendar. I replied that I would accept the morning talk as long as I could manage to leave before noon to make my flight. And as you may have guessed by now, by the time I got back to New York where I could verify my human identity to my Google media extensions, the calendar had a different version of my future than I did. My talk in Missouri was the same morning as the New York talk—not the morning after. Oops. Present-shock nightmare: I was supposed to be in two places at once. No matter how digitally adept I get, there’s still only one me.
Unlike humans who might treasure their uniqueness, digital media’s most defining quality is its ability to be copied exactly. Digital files, even digital processes, can be multiplied and exist in several instances at once. This is something new. The real-world, analog copies of things we have known up to now are all efforts to remain as true as possible to some original source. Even the movies we watch are struck from an internegative of the movie, which is in turn struck from the original negative. Like successive photocopies, each one contains a bit more noise—but it’s better than playing the original negative repeatedly and wearing out the source. In contrast, digital copies are indistinguishable from the original. They don’t take a toll on the thing being copied, either. In a sense, they are not copies at all. Once created, they are the original.
In the analog world, the original must be preserved, and copying slowly destroys it. Every time the original negative is run through the printing machine, it is further degraded. Flash pictures are prohibited at museums, because each exposure to light slightly dims the brilliance of an original painting. Unlike such photos, digital copies are not impressions made off a master; they are originals themselves—more like clones than children, capable of carrying on new activities as if utterly reborn in every instant.
People are still analog.
Leveraged appropriately, the immense distributive power of digital production and networks gives us the ability to spread our ideas and expressions, as well as our power and influence. We can produce effects in more than one place at a time, each of us now having the global reach formerly reserved for kings, presidents, and movie stars. Our role in our culture and society may have changed from that of passive readers or watchers to that of active game players, but this self-direction comes at a cost. We previously had the luxury of being led through experiences, our one-pointed attention moving along a tightly controlled narrative pathway. In the choose-your-own-adventure landscape of gaming—as well as the culture, workplace, and political realm structured in this new way—we must keep our eyes moving and at the same time stay aware of all sorts of activity on the periphery.
Wherever our real bodies may be, our virtual personae are being bombarded with information and missives. Our inboxes are loading, our Twitter feeds are rolling, our Facebook updates are changing, our calendars are filling, and our consumer profiles and credit reports are adjusting all along the way. As in a game, things we do not act upon don’t wait for us to notice them. Everything is running in parallel, and sometimes from very far away. Timing is everything, and everyone is impatient.
Even though we may be able to be in only one place at a time, our digital selves are distributed across every device, platform, and network onto which we have cloned our virtual identities. The people and programs inhabiting each of these places relate to our digital addresses and profiles as if they were the original. The question is always “Why hasn’t he answered my email?” and never “When will he log on to the Internet and check the particular directory to which my text was copied?”
Our digital devices and the outlooks they inspired allowed us to break free of the often repressive timelines of our storytellers, turning us from creatures led about by future expectations into more fully present-oriented human beings. The actual experience of this now-ness, however, is a bit more distracted, peripheral, even schizophrenic than that of being fully present. For many, the collapse of narrative led initially to a kind of post-traumatic stress disorder—a disillusionment, and the vague unease of having no direction from above, no plan or story. But like a dose of adrenaline or a double shot of espresso, our digital technologies compensate for this goalless drifting with an onslaught of simultaneous demands. We may not know where we’re going anymore, but we’re going to get there a whole lot faster. Yes, we may be in the midst of some great existential crisis, but we’re simply too busy to notice.
We have already heard a great deal from concerned doctors and humanists about the ill effects of living digitally.1 Their findings are on record and deserving of our consideration. While all their warnings may be true to some extent, so, too, were the warnings about automobiles and steam engines, or even the threat that written language and law posed to tribes once unified by spoken stories, and that printed Bibles posed to the authority of the Pope and his priests. The things we use do change us. In order to understand and contend with present shock, we should probably be less immediately concerned with the cause-and-effect consequences of digital activity than with the greater implications and requirements of living in the digital environment. It’s not about how digital technology changes us, but how we change ourselves and one another now that we live so digitally.
We live in a world informed in large part by digital devices and outlooks, and one of the primary impacts of thinking this way is to assume the rigors of digital time as our own. Our digital universe is always-on, constantly pinging us with the latest news, stock quotes, consumer trends, email responses, social gaming updates, Tweets, and more, all pushing their way to our smart phones. There are so many incoming alerts competing for attention that many phones now allow users to swipe downward to reveal a scrollable screen containing nothing but the latest alerts pushed through. Everyone and everything intrudes with the urgency of a switchboard-era telephone operator breaking into a phone call with an emergency message from a relative, or a 1960s news anchor interrupting a television program with a special report about an assassination. Anything we do may be preempted by something else. And, usually, we simply add the interruption onto the list of other things we’re attempting to do at the same time.
All these interruptions, more than simply depleting our cognitive abilities, create the sense that we need to keep up with their impossible pace lest we lose touch with the present. These are live feeds, after all, pinging us almost instantaneously from every corner of the globe. There are video cameras trained on Wall Street and the Western Wall, a tent village in Cairo and a box of schnauzer puppies in a Florida pet shop.
If we could only catch up with the wave of information, we feel, we would at last be in the now. This is a false goal. For not only have our devices outpaced us, they don’t even reflect a here and now that may constitute any legitimate sort of present tense. They are reports from the periphery, of things that happened moments ago. It seems as if to digest and comprehend them in their totality would amount to having reality on tap, as if from a fantastic media control room capable of monitoring everything, everywhere, all at the same time. It’s as if all the Facebook updates, Twitter streams, email messages, and live-streamed video could combine to create a total picture of our true personal status, or that of our business, at any given moment. And there are plenty of companies out there churning all this data in real time in order to present us with metrics and graphs claiming to represent the essence of this reality for us. And even when they work, they are mere snapshots of a moment ago. Our Facebook profile and the social graph that can be derived from it, however intricate, is still just a moment locked in time, a static picture.
This quest for digital omniscience, though understandable, is self-defeating. Most of the information we get at lightning speed is so temporal as to be stale by the time it reaches us. We scramble over the buttons of the car radio in an effort to get to the right station at the right minute-after-the-hour for the traffic report. Yet the report itself warns us to avoid jams that have long since been cleared, while telling us nothing about the one in which we’re currently stuck—one they’ll find out about only if we ourselves call it in to their special number. The irony is that while we’re busily trying to keep up with all this information, the information is trying and failing to keep up with us.
Meanwhile, the extraordinary measures we take to stay abreast of each minuscule change to the data stream end up magnifying the relative importance of these blips to the real scheme of things. Investors trade, politicians respond, and friends judge based on the micromovements of virtual needles. By dividing our attention between our digital extensions, we sacrifice our connection to the truer present in which we are living. The tension between the faux present of digital bombardment and the true now of a coherently living human generates the second kind of present shock, what we’re calling digiphrenia—digi for “digital,” and phrenia for “disordered condition of mental activity.”
This doesn’t mean we should ignore this digitally mediated reality altogether. For just as we found healthier responses to the fall of narrative than panic and rage, there are ways to engage with digital information that don’t necessarily dissect our consciousness into discrete bits right along with it. Instead of succumbing to the schizophrenic cacophony of divided attention and temporal disconnection, we can program our machines to conform to the pace of our operations, be they our personal rhythms or the cycles of our organizations and business sectors. Computers don’t suffer present shock, people do. For we are the only ones living in time.
TIME IS A TECHNOLOGY
We tend to think of the assault on our temporal sensibilities as a recent phenomenon, something that happened since the advent of computers and cell phones—or at least since the punch clock and shift workers. But as technology and culture theorists have reminded us at each step of the way,2 all this started much, much earlier, and digiphrenia is just the latest stage in a very long and lamented progression. At each of these stages, what it meant to be a human being changed along with however it was—or through whatever it was—we related to time.
Of course, humans once lived without any concept of time at all. In this early, hunter-gatherer existence, information was exchanged physically, either orally or with gestures, in person. People lived in an eternal present, without any notion of before or after, much less history or progress. Things just were. The passage of time was not recorded or measured, but rather experienced in its various cycles. Older, wiser people and tribes became aware not just of the cycles of day and night, but of the moon and even the seasons. Since farming hadn’t yet been invented, however, seasons were not to be anticipated or exploited. Beyond gathering a few nuts as it got colder, there was little we could do to shift or store time; the changes around us were simply enjoyed or endured.
Many religions and mythologies look back longingly on this prehistoric timelessness as a golden age, or Eden. Humanity is seen as a fetus in the womb, at one with Mother Nature.3 False notions of a prehistoric noble savage aside, there is at least some truth to the idea that people lacked the capacity to distinguish themselves from nature, animals, and one another. While living so completely at the mercy of nature was fraught with pain and peril, this existence was also characterized by a holism many media and cultural theorists consider to be lost to us today in a world of dualism, preferences, and hierarchies. As media theorist and Catholic priest Walter Ong put it, “Oral communication unites people in groups. Writing and reading are solitary activities that throw the psyche back on itself. . . . For oral cultures, the cosmos is an ongoing event with man at its center.”4 People living in this oral, timeless civilization saw God, or the gods, in everything around them. While they had to worry about where their next meal was coming from, they felt no pressure to succeed or to progress, to achieve or to improve. They had nowhere to go, since the very notion of a future hadn’t yet been invented. This stasis lasted several thousand years.
Everything changed, finally, in the Axial Age with the invention of text. The word-pictures of hieroglyphic writing were replaced with the more discrete symbols of an alphabet. The progenitor of a more digital style of storage, letters were precise and abstract. Combined together, they gave people a way to represent the mouth noises of oral culture in a lasting artifact. Like a digital file, a spelled word is the same everywhere it goes and does not decay. The simple twenty-two-letter alphabet popularized and democratized writing, giving people a way to record promises, debts, thoughts, and events. The things written down one day could be read and retrieved the next.
Once a line could truly be drawn in something other than sand, the notion of history as a progression became possible. With the invention of text came the ability to draft contracts, which were some of the first documents ever written, and described agreements that endured over time. With contracts came accountability, and some ability to control what lay ahead. The notion of a future was born. Religion, in the oral tradition, came from the mouth of a leader or pharaoh, himself a stand-in for God. Text transformed this passive relationship to God or nature with a contract, or, more precisely, a covenant between people and God. What God demands was no longer a matter of a tyrant’s whim or the randomness of nature, but a set of written commandments. Do this and you will get that.
This resonated well with people who were learning agriculture and developing a “reap what you sow” approach to their world. Seeds planted and tended now yield a crop in the future. Scriptural laws obeyed now earn God’s good graces in the future. The world was no longer just an endless churn of cycles, but a place with a past and a future. Time didn’t merely come around; it flowed more like a river, forming a history of all that went before. In the new historical sense of time, one year came after the other. Human beings had a story that could be told—and it was, in the Torah and other written creation myths. Pagan holidays that once celebrated only the cycle of the seasons now celebrated moments in history. The spring equinox and fertility rites became the celebration of the Israelite exodus from Egypt; the solstice became the Hanukkah reclamation of the Temple and, later, the birth of Jesus. Periods in the cycle of nature became moments in the flow of history.5
The new metaphor for time was the calendar. A people was defined and its activities organized by its calendar, its holidays, and its memorials. Calendars tell a culture what matters both secularly and religiously. The time for sacred days was held apart, while time for productivity could be scheduled and even enforced. The calendar carried the double-duty of representing the cyclical nature of the lunar months and solar year while also keeping track of historical time with the passing of each numbered year. There was now a before and an after—a civilization that could measure its progress, compare its bounties from one year to the next, and, most important, try to do better. The great leaning forward had begun. We progressed from what social theorist Jeremy Rifkin called “the Earth’s universe” to “God’s universe,”6 conceiving ourselves as participants in a greater plan and subject to a higher law and an external gauge of our success over time.