
The End of Absence: Reclaiming What We've Lost in a World of Constant Connection


by Michael Harris


  What we’re witnessing is a kind of instant complicity, a massive and swift behavior shift. And so it may come to pass that a person disconnected from (or merely uninterested in) this flood of new technologies could, in a few years, be made entirely alien to a younger generation. Step out of a running stream and you cannot reenter the same water. I myself stepped away from video games for a decade and now I’m unable to reenter the fray. The latest games are nonsensical and manic to me, though people older than me—who never stepped out of the stream—are perfectly at ease, even thrilled, with the latest offerings.

  • • • • •

  Wariness about the way our technologies are shaping our thoughts, mutterings about “kids these days” and their gadgets, can be tracked back thousands of years. Most famous among the cranks is kindly Socrates, who was perhaps the first to wonder whether gadgets—in his case letters and writing—could change mental processes.

  In Plato’s Phaedrus, we hear Socrates describing how a king from Egypt called Thamus informed the god Theuth that the phonetic alphabet was not so great a gift. The god was particularly chuffed about this new technology, which he delivered to poor, illiterate humans, bragging that writing would make the memories of Egyptians more powerful and that it would supercharge their wit. King Thamus shrewdly replies:

  O most ingenious Theuth . . . this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing.

  Was there ever a finer description of Google? “An aid not to memory, but to reminiscence.” Real memory and the magic trick of reminiscence, of course, are not the same thing at all. We reminisce when something external recalls the memory for us. Unlike our hard-won memories, a reminiscence is easy, passive, and provided by some reminder. But the user of a technology that recollects on his or her behalf (a scroll, say, stuffed with important dates and names) is not likely to care about that subtle distinction. Kids these days, for Socrates, were rotting their brains by abandoning the oral tradition.

  Several millennia later, in the fifteenth century, the fantastically named Venetian editor Hieronimo Squarciafico looked around at “kids these days” and groaned that the advent of book publishing would lead to intellectual laziness. Men would become less studious when material became so cheap to produce and so whorishly available. The mind would turn to mush. The Florentine book merchant Vespasiano da Bisticci backed Squarciafico up, saying that a printed book should be “ashamed” in the company of a handmade manuscript (one wonders whether his disdain was motivated more by business concerns than aesthetics).

  Our modern, studious commitment to the technologies of writing and printing, then, is a startling departure from the experience of our ancestors. Those men weren’t wrong to be suspicious; something fundamental had been changed. Here’s Berkeley psychologist Alison Gopnik, describing how the act of reading reshaped our brains long before the Internet got its hands on them:

  Cortical areas that once were devoted to vision and speech have been hijacked by print. Instead of learning through practice and apprenticeship, I’ve become dependent on lectures and textbooks. And look at the toll of dyslexia and attention disorders and learning disabilities—all signs that our brains were not designed to deal with such a profoundly unnatural technology.

  Our devotion to reading feels wholesome, natural, but is in fact a wonderful kind of brainwashing. Marshall McLuhan, having fewer brain scans in his arsenal than Gopnik, speaks in more obscure terms when analyzing the fallout of the printing press. For him, printed words became a gravitational force, something our minds reorganized around. “For the most obvious character of print,” he notes, “is repetition, just as the obvious effect of repetition is hypnosis or obsession.” For McLuhan, Stephen King novels and ingredients lists on cereal boxes and the words you’re reading right now are all conspiring to make you think this strange attention you’re paying to tiny lines of printed symbols is a natural act. But it’s not. The intensely myopic attention that the act of poring over a book requires of us is anything but natural, and it reshaped our attitude toward the world at large, bringing about—according to McLuhan—the dawn of capitalism, the regulation of language, and the dominance of the visual at the expense of our multisensory lives: “The eye speeded up and the voice quieted down.” He attributes the bulk of our “shrill and expansive individualism” to Gutenberg’s invention.

  After the arrival of mass-produced books, we became “typographical man,” and our voices lost some power. We were encouraged by the technologies of writing and printing to take on some kinds of input and discouraged from taking on others. Today we privilege the information we take in through our eyes while reading and pay less heed to information that arrives via our other senses. In plainest terms, McLuhan delivers his famous line: “The medium is the message.” What you use to interact with the world changes the way you see the world. Every lens is a tinted lens.

  • • • • •

  A latter-day King Thamus or Squarciafico would grumble at me for using my phone to call up my partner’s number. In fact, I’ve never known Kenny’s number by heart. But it’s not something I worry about or seek to fix. Likewise, if adults in 2064 manage to entirely outsource their memories to digital aids, they won’t begrudge their situation at all, but will rejoice in their mental freedom. How many of us long for more things to store in our brains? Indeed, the value of doing things the hard way becomes a question of “things you never knew you never knew,” to steal a line from Disney’s Pocahontas. I don’t know what satisfaction I might gain from carrying that information in my brain instead, just as my child will never know the value of learning to read a map without GPS. And neither of us will think to care. This is the problem with losing lack: It’s nearly impossible to recall its value once it’s gone.

  Which is why the ancients all cry out in their turn: “Kids these days!” Youths, and the technologies that inform their sensibilities, will always be at odds with the dying techno-sensibility that informed the character of their elders. Yet the scale of discordance in contemporary culture is perhaps an unprecedented thing. Digital natives are subject to a violent removal from the habits of their parents, a shift that will leave them quite alien to those only one generation older, and vice versa.

  When I pause to consider that last remark, I see I’m being way too conservative. There’s actually a chasm between me and folk five years younger. The other day, I was speaking with a young friend of mine—a journalist in his late twenties—and he thought nothing of carrying on a text conversation with someone else while speaking with me. (I am, trust me, painfully aware of being transformed into the kind of man people call “crotchety” here.) It’s a common annoyance, barely worth noting, except that I’d been thinking about what it meant to be constantly put on hold by a person I’m sharing a beer with. It seemed to me that 80 percent of his attention procured 20 percent of my interest. It’s a case of compound distraction. But the really gruesome thing was that he didn’t notice or care that we were both so disengaged. The “natural” attention of someone just a few years younger than me is vastly more kinetic and fractured—attention span has evolved.

  Just how insidious is our difference in attitude? How violent is the change between one mental state and the next?

  • • • • •

  The brains our children are born with are not substantively different from the brains our ancestors were born with forty thousand years ago. For all the wild variety of our cultures, personalities, and thought patterns, we’re all still operating with roughly the same three-pound lump of gray jelly. But almost from day one, the allotment of those brains (and therefore the way they function) is different today from the way it was even one generation ago. Every second of your lived experience represents new connections among the roughly eighty-six billion neurons packed inside your brain. Every minute you spend in the particular world that you were born into makes you massively, and functionally, different from those who came before. Children, then, can become literally incapable of thinking and feeling the way their grandparents did. A slower, less harried way of thinking may be on the verge of extinction.

  To understand the severity of this predicament, though, we first need to understand just how very vulnerable, how plastic, our minds really are.

  The plasticity of our minds is a marvelous thing to behold. In your brain alone, your billions of neurons are tied to each other by trillions of synapses, a portion of which are firing right now, forging (by still mysterious means) your memory of this sentence, your critique of this very notion, and your emotions as you reflect on this information. And these transmissions play out, we’re finding, in a highly organic and malleable fashion. Our brains are so plastic, so open-minded, that they will reengineer themselves to function optimally in whatever environment we give them. Repetition of stimuli produces a strengthening of responding neural circuits. Neglect of other stimuli will cause corresponding neural circuits to weaken. (Grannies who maintain their crossword puzzle regime knew that already.)
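
  That use-it-or-lose-it dynamic is simple enough to sketch. The toy Python below illustrates only the principle (repetition strengthens a circuit, neglect lets it fade); the "circuits," the learning rate, and the decay are all invented for the example, not drawn from any real neuroscience.

```python
# Toy illustration of "repetition strengthens, neglect weakens."
# Every value here is made up for the sketch; it shows the principle, not a brain model.
weights = {"crosswords": 0.5, "video_games": 0.5}  # two hypothetical neural circuits
LEARNING_RATE = 0.1  # how much a repeated stimulus strengthens its circuit
DECAY = 0.02         # how much a neglected circuit weakens each step

def practice(stimulus, steps=50):
    """Repeatedly present one stimulus and let the other circuits go unused."""
    for _ in range(steps):
        for circuit in weights:
            if circuit == stimulus:
                # Repetition: nudge the active circuit toward full strength.
                weights[circuit] += LEARNING_RATE * (1.0 - weights[circuit])
            else:
                # Neglect: let the unused circuit drift toward zero.
                weights[circuit] -= DECAY * weights[circuit]

practice("crosswords")
print(weights)  # the practiced circuit ends far stronger than the neglected one
```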

  And as crossword-puzzling grandmothers know, it is not only the brains of the young that are vulnerable to environmental influence. While many still think that our personalities—and our brains—effectively crystallize when we graduate high school, we now know that our brains in fact remain plastic, changeable, throughout our lives. No matter your age, your brain’s ability to think, to feel, to learn, is minutely different from the way it was yesterday. What you think and how you think are up for grabs.

  This plasticity is the ultimate consolation for the perennial “nature vs. nurture” argument, by the way. Evolution (nature) endowed us with minds capable of fast and furious transformation, minds able to adapt to strange new environments (nurture) within a single lifetime—even within a few weeks. Therefore, we’re always products of both inherited hardware and recently downloaded software. We are each a brilliant symbiosis of nature and nurture.

  UCLA’s Gary Small is a pioneer of neuroplasticity research, and in 2008 he produced the first solid evidence showing that our brains are reorganized by our use of the Internet. He placed a set of “Internet naïve” people in MRI machines and made recordings of their brain activity while they took a stab at going online. Small then had each of them practice browsing the Internet for an hour a day for a mere week. On returning to the MRI machine, those “naïve” folk now toted brains that lit up significantly in the frontal lobe, where there had been minimal neural activity beforehand. Neural pathways quickly develop when we give our brains new tasks, and Small had shown that this held true—over the course of just a few hours, in fact—following Internet use.

  “We know that technology is changing our lives. It’s also changing our brains,” he announced. On the one hand, neuroplasticity gives him great hope for the elderly. “It’s not just some linear trajectory with older brains getting weaker,” he told me. Your brain’s ability to empathize, for example, will increase as you age. The flip side of all this, though, is that young brains, immersed in a dozen hours of screen time a day, may be more equipped to deal with digital reality than with the decidedly less flashy reality reality that makes up our dirty, sometimes boring, often quiet, material world.

  In The Shallows, Nicholas Carr describes how the Internet fundamentally works on our plastic minds to make them more capable of “shallow” thinking and less capable of “deep” thinking. After enough time in front of our screens, we learn to absorb more information less effectively, skip the bottom half of paragraphs, shift focus constantly; “the brighter the software, the dimmer the user,” he suggests at one point.

  The most startling example of our brain’s malleability, though, comes from new research by neural engineers who now suggest that our children will be able to “incept” a person “to acquire new learning, skills, or memory, or possibly restore skills or knowledge that has been damaged through accident, disease, or aging, without a person’s awareness of what is learned or memorized.” I am quoting here from a report issued by a Boston University team led by Takeo Watanabe. His team was able to use decoded functional magnetic resonance imaging (fMRI) to modify in highly specific ways the brain activity in the visual cortex of their human subjects. “Think of a person watching a computer screen,” suggested the National Science Foundation when it announced the research, “and having his or her brain patterns modified to match those of a high-performing athlete.” The possibilities of such injections of “unearned” learning are as marvelous as they are quagmires for bioethical debate. Your grandchild’s brain could be trained in a certain direction while watching ads through digital contact lenses without his or her awareness (or, for that matter, acquiescence). In other words, decoded neurofeedback promises truly passive learning, learning without intention from the person who is to be “informed.”

  For now, it’s easier to tell that something has changed in our minds, but we still feel helpless against it, and we even feel addicted to the technologies that are the agents of that change.

  But will our children feel the static? Will X Factor audition videos replace basement jam sessions? Will “deep” conversation and solitary walks be replaced by an impoverished experience of text clouds? Will the soft certainty of earlier childhood be replaced by the restless idleness that now encroaches? Our children will always have their moments of absence, of course—their lives will not be wholly zombielike but will be a mixture of connection and disconnection. They will get lost in the woods, they will run naked on beaches, they will sometimes shut off their devices. The important question is whether the bias is shifting—whether they’ll find it as easy to access absence and solitude. What’s important is that we become responsible for the media diets of our children in a way that past generations never were. Since our children are privy to a superabundance of media, we now need to proactively engineer moments of absence for them. We cannot afford to count on accidental absence any more than we can count on accidental veggies at dinner.

  Without such engineered absences (a weekend without texting, a night without screens), our children suffer as surely as do kids with endless access to fast food. The result is a digital native population that’s less well rounded than we know they could be. In 2012, Elon University worked with the Pew Internet and American Life Project to release a report that compiled the opinions of 1,021 critics, experts, and stakeholders, asking for their thoughts on digital natives. Their boiled-down message was that young people now count on the Internet as “their external brain” and have become skillful decision makers—even while they also “thirst for instant gratification and often make quick, shallow choices.” Some of those experts were optimistic about the future brains of the young. Susan Price, CEO and chief Web strategist at San Antonio’s Firecat Studio, suggested that “those who bemoan the perceived decline in deep thinking . . . fail to appreciate the need to evolve our processes and behaviors to suit the new realities and opportunities.” Price promises that the young (and those who are young at heart) are developing new skills and standards better suited to their own reality than to the outmoded reality of, say, 1992.

  Those “new standards” may, one presumes, place a priority on the processing of information rather than the actual absorption of information. In Socrates’ terms, we’re talking about reminiscence instead of memory, and the appearance of omniscience. Meanwhile, the report’s coauthor, Janna Anderson, noted that while many respondents were enthusiastic about the future of such minds, there was a clear dissenting voice:

  Some said they are already witnessing deficiencies in young people’s abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society.

  Several respondents took the opportunity, in fact, to cite George Orwell’s dystopian fantasy 1984. Citizens are always manipulated by some authority or other, but an Orwellian future all but wipes out consciousness of (and criticism of) the subjugation of the masses. In order to keep youths noticing those manipulations in our own pseudo-Orwellian world, we first need to teach them how our technologies evolved.

  • • • • •

  Charles Darwin’s The Origin of Species may have outlined, back in 1859, an idea that explains our children’s relationship with iPhones and Facebook. Here’s the elevator-pitch version of his book: If you have something that copies itself with slight variations, and if that something exists in a competitive environment that will weed out those less suited to the given environment, then you must get what the American philosopher Daniel Dennett has called “design out of chaos without the aid of mind.” Evolution is not, then, some magical occurrence, but a mathematical certainty. Given an item’s ability to copy itself with variation, and given a competitive environment, you must have evolution.
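
  To see how little mind the recipe requires, consider a rough sketch in Python. The target phrase, the mutation rate, and the population size below are arbitrary stand-ins for whatever environment does the weeding; none of the particulars come from Darwin or Dennett. Run it and a population of nonsense strings stumbles, generation by generation, onto the target, with nobody steering.

```python
import random
import string

# A toy run of Darwin's recipe: imperfect copying plus a competitive environment.
# The target, mutation rate, and population size are arbitrary choices for this sketch.
TARGET = "design out of chaos"
ALPHABET = string.ascii_lowercase + " "

def fitness(candidate):
    """The 'environment': candidates matching more of the target are better suited."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def copy_with_variation(parent, mutation_rate=0.05):
    """Copying with slight variation: each character occasionally mutates."""
    return "".join(
        random.choice(ALPHABET) if random.random() < mutation_rate else ch
        for ch in parent
    )

population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(100)]
generation = 0
while max(fitness(p) for p in population) < len(TARGET):
    fittest = max(population, key=fitness)  # selection weeds out the less suited
    population = [copy_with_variation(fittest) for _ in range(100)]
    generation += 1

print(f"'{TARGET}' assembled itself in {generation} generations")
```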

  So is the goop of our DNA the only thing in the universe that can meet Darwin’s requirements for evolution? The English evolutionary biologist Richard Dawkins took the next logical step in 1976 and coined one of the most important, misunderstood, and bandied-about terms of our age: the meme.

  The “meme” (from the ancient Greek mimeme, which means “that which is imitated”) is an extension of Darwin’s Big Idea past the boundaries of genetics. A meme, put simply, is a cultural product that is copied. A tune is one; so is a corporate logo, a style of dress, or a literary cliché like “the hero’s journey.” We humans are enamored of imitation and so become the ultimate “meme machines.” The young are best of all: Twerking videos and sleepover selfies are memes par excellence. Memes—pieces of culture—copy themselves through history and enjoy a kind of evolution of their own, and they do so riding on the backs of successful genes: ours.

 
