A Deadly Wandering: A Tale of Tragedy and Redemption in the Age of Attention


by Matt Richtel


  That is how all of this connects to technology, and how we use it in our daily lives. Reggie’s story, in the end, isn’t just warning us not to use technology behind the wheel. It is screaming out to us as a society: pay attention. Know the risk technology poses in your life, and tame it.

  Sure, there are lots of little rules, actions a person might take to limit technology, like turning off the function on a computer or phone that flashes an instant update every time there is a new email or Facebook status update. Or putting the phone in the trunk of the car when you drive, thereby sparing yourself from fending off the impulse that assaults bottom-up attention, sends a desperate signal to the frontal lobe, and interrupts the driving task.

  The journey that travels from Donders to Helmholtz, through Broadbent and Treisman, to Posner and Strayer, tells us a few basic things: that the brain is limited, lacks bottomless capacity, and isn’t particularly fast relative to computer technology. The technological advances defined by the laws of Moore and Metcalfe got married and have long since overtaken what our brains can handle. And we know now that their union also presents the challenge of addiction, or, at least, serious compulsion. Even when we know these devices challenge our limitations, sometimes they are so compelling that we can’t stop using them.

  To ignore this science is to engage in self-deception, to tell ourselves a lie.

  And the neuroscience bears this out.

  REMEMBER THE CHOCOLATE CAKE study, the one that looked at the nutrition choices made by someone asked to remember numbers? It showed that a person’s decision making is impaired by something as simple as a modest memory task.

  The choices someone makes—and the clarity of their mind—directly relate to how much information the person is trying to juggle. A bunch of studies support, directly and indirectly, the importance of not being overloaded with information.

  A remarkable bit of research was done with veterans returning from Iraq and Afghanistan. They were excitable, some suffering from post-traumatic stress disorder. The research, led by Emma Seppala, the associate director of the Center for Compassion and Altruism Research and Education at Stanford University, found that the veterans saw a significant decrease in their startle response when they did deep breathing. Put another way: They were less impulsive and less responsive to external stimulation. Dr. Seppala says the research could be extended to help people become less responsive to their devices, to exhibit more top-down control of their environment rather than having it drive them.

  Another study, from the University of California at San Diego, found that creativity is enhanced when people let their minds wander, or when they do the most menial of tasks. Other research reinforced the risks of overload to decision making and learning, or, looking at those studies from a different perspective, demonstrated the clear value of taking breaks from constant stimulation, whether provided by technology or some other high-fidelity experience.

  A study at the University of Michigan compared people’s ability to learn and remember information under two different conditions. In one condition, the subjects walked in a dense urban environment and then tried to learn and remember information. In the other condition, the subjects walked in nature. Those subjects, the ones who walked in nature, did a better job at learning.

  To researchers, the study suggests the restorative value of nature, and for a key reason: In nature, people are processing less information. Not zero information (they are taking in the world around them), but less dense, less intense information. Even the background noise of a city can impair the ability to learn and remember, and can deplete neurological resources.

  In another study, researchers at the University of California, San Francisco, appear to have found more evidence of the cost of constantly consuming information. The UCSF scientists discovered that when a rat (their test subject) has a new experience, say, exploring a new environment, it stimulates new patterns of brain activity. But, the researchers concluded, the rats don’t appear to encode that experience as learning and memory unless they have some downtime after the experience. In other words, if the rat continues to be awash in new experiences and doesn’t give the brain a break, it doesn’t process the new information.

  The sum of these studies, and others along the same lines, offers a pretty clear road map to reaching a state of mind that allows for good decision making and better awareness of one’s life—the precursors to balance and happiness.

  Take breaks from stimulation.

  Put another way: Turn off the device for a sustained period—whether hours or days. And then, and here’s the really tough part, don’t fill the void left by the absence of stimulation with some other nonstop stimulation. That can be hard to do, the experts told me, when people are so accustomed to the constant stream of pings and external noises. A veritable panic can ensue—What do I do with this void? But it’s there, when things die down, that learning and memory take hold and, more than that, that the greater powers of decision making come into play. When a person is clearheaded, the frontal lobe is freed of the hum and buzz of external pressure. That’s when you can decide what steps are best, what actions are wisest.

  “Don’t put down the phone just to get preoccupied by something else,” Soren Gordhamer told me for a story I did for the New York Times. He’s the organizer of Wisdom 2.0, a growing movement in the Bay Area aimed at helping people find balance in the modern world. “If you’re not careful, the same orientation you have toward your device, you’ll put toward something else.”

  The neuroscience, while lending support to the idea of taking breaks, was also providing the seeds of another solution to distraction: a high-tech solution, with Dr. Strayer and Dr. Gazzaley, among others, at the forefront.

  CHAPTER 49

  THE NEUROSCIENTISTS

  DR. STRAYER WAS INVITED in 2010 to the Transport Research Lab, a swanky, glass-paneled center in Wokingham, about thirty miles west of Central London. He was there to talk about a group of individuals whom he and Jason Watson were calling the “supertaskers.” It was this type of person whom Dr. Strayer had identified several years earlier, by accident, when he and his student discovered a study subject who seemed to be able to do two things at once without much cost to performance.

  In the interim, the pair had identified a handful of such people. There weren’t many. But they did seem to exhibit some profound characteristics, not just behaviorally, but neurologically. When the researchers scanned the subjects’ brains, they discovered that this tiny fraction of people seemed to have less activity going on in key portions of the frontal lobe. “Their brains were less metabolically active,” as Dr. Strayer explains it in slightly more scientific terms. In other words: When the people were doing two tasks, “their brains weren’t working as hard to do it.”

  Dr. Strayer and others developed a hypothesis that these people “are extremely good at neural efficiency.” Meaning: They don’t need as much of their brains to process information and therefore might be able to add on more tasks without overloading the system.

  The idea at the research lab in Wokingham was something of a gimmick, a “media event” designed to show some of the emerging research. Scientists in Britain had found a few people who seemed particularly adept at multitasking—potential supertaskers—for Dr. Strayer to put through their paces. Most did okay, but there was one woman whose capacities blew Dr. Strayer away. She was in her late twenties, a high-level speed cyclist, Olympic caliber. To establish her cognitive baseline that day, the researchers had her do some math problems and memory challenges. She missed just one question.

  They put her in the high-fidelity driving simulator. Her score on the cognitive test rose, what little it could. She got a perfect score while navigating intense roadways with ease. “I’d never seen it before,” Dr. Strayer said, adding that dual-task challenges like these “are designed to destroy most people.”

  After years of focusing on distraction, Dr. Strayer was fascinated by this group of people. He started collecting their DNA. What was it about these people? He ran a test on one of their neurotransmitters that regulates dopamine, which in turn is associated with attention. He came up empty. Nothing was easily presenting itself. The sample size was too small, the haystack too big, and the needle invisible.

  But the concept of how their brains work held promise for Dr. Strayer. “Maybe if we can understand something about extraordinary behavior, we can understand how ordinary people can do it.”

  And in this analysis, Dr. Strayer offers another key thought: Ordinary people aren’t supertaskers. Far from it. Ordinary people do better at a task when they focus on it, not when they try to juggle it with another task. Ordinary people aren’t like the woman in the experiment outside London. Ordinary people are like Reggie.

  The other thing Dr. Strayer’s work underscores is that the neuroscience community was looking for answers not just to how attention works, or how it can be improved through behavioral measures (deep breathing), but to how it might be changed by tinkering with the brain itself. Could this be the answer? We don’t change the technology; we change ourselves?

  A TWENTY-TWO-YEAR-OLD STUDY SUBJECT sits at a table in dim light. In her hands is an iPad. On her head is a cap with thirty-two wires coming from it. They are attached to electrodes, which measure the electrical activity in her brain.

  A few feet away, through a wall and over a closed-circuit camera, Dr. Gazzaley and a volunteer assistant named Jesse watch the study subject, a teacher who lives in the East Bay, just across the Bay Bridge from San Francisco.

  This experiment is part of Dr. Gazzaley’s new $10 million lab, located in the basement of the Sandler Neurosciences Center, five floors below where the neuroscientist has his office. Generally, academic labs sound much fancier than they are; often they are windowless rooms, cubbyholes, and former closets dug out and jerry-rigged by cash-strapped researchers. But this one is different.

  Dr. Gazzaley, though he is fighting for every dollar, has managed to oversee the construction of something sleek, more in his image. In two adjoining rooms, there are MRI machines. Here, in the control room, three monitors on the wall can show what is happening in each of the rooms.

  On one of the computers on the desk, a stream of data begins to pour in: the electrical signals from inside the subject’s brain. The data is not decipherable in the moment, but eventually it will offer a picture of what is happening in her brain as she’s asked to distribute her focus more widely.

  On the screen of her iPad, she plays a relatively primitive video game that entails identifying targets that at first appear in the middle of the screen and then appear farther out on the periphery. What Dr. Gazzaley and his team want to do is not merely understand how the brain works when confronted with a demand to widen attention. They then want to put that knowledge to work to try to broaden the brain’s capacity for attention.

  “We don’t know if it’s pushable,” he says. He’s smiling, almost sheepish, as if to say: We’re going for it, but who knows.

  But if they’re pushing ideas here—trying things that just may not work—this is hardly mad science. In fact, in the summer of 2013, not long after this experiment got under way, Dr. Gazzaley and his team scored a huge scientific coup. They found out the journal Nature would publish their paper that described how they could train the brain to improve and sustain attention using a video game.

  The research, which took place over four years and cost $300,000, entailed training 174 adult subjects to play a targeted video game. The game was called NeuroRacer, a three-dimensional driving game that asked the subjects to drive while simultaneously doing perceptual tasks, like recognizing objects around them.

  “It is clear that multitasking behavior has become ubiquitous in today’s technologically dense world, and substantial evidence has accrued regarding multitasking difficulties and cognitive control deficits in our aging population,” reads the introduction to the Nature paper. Translation: As people get older, they have more and more trouble keeping up with switching tasks and focusing under demanding circumstances.

  As the subjects used the video game, the researchers studied their performance with behavioral tests, tracked their eyes with scanners, and captured the electrical activity in their brains with EEG. As the research wore on, Dr. Gazzaley and his peers discovered something they found remarkable: Older subjects, age sixty and above, showed increased ability to focus. What was remarkable was that this increased capacity to sustain attention came not within the game, but after they’d finished, weeks after.

  “Critically, this training resulted in performance benefits that extended to untrained cognitive control abilities (enhanced sustained attention and working memory),” the paper concluded. The EEG research found an intensification over time of theta waves in the prefrontal cortex, thought to be a measure of sustained attention. That change predicted a “boost in sustained attention and preservation of multitasking improvement six months later.”

  The findings, the paper said, “provide the first evidence . . . of how a custom-designed video game can be used to assess cognitive abilities across the lifespan, evaluate underlying neural mechanisms, and serve as a powerful tool for cognitive enhancement.”

  “The research shows you can take older people who aren’t functioning well and make them cognitively younger through this training,” Earl Miller, a neuroscientist at the Massachusetts Institute of Technology, told me. “It’s a very big deal.”

  Indeed, the research earned kudos from a range of attention scientists. Among them is Daphne Bavelier, a neuroscientist at the University of Rochester, who has been one of the pioneers in trying to understand how technology could be used to expand and improve attention. She was able to show that young people playing off-the-shelf, shoot-’em-up video games showed an improved capacity to sustain attention, even after the game was over. Neither she nor Dr. Gazzaley believed their findings should be taken to mean that the simple act of playing a video game improves the brain, or, specifically, the attention networks. Rather, she says, the technology—coupled with brain imaging—began to offer hope that real therapies could be developed for attention-related issues.

  She said Dr. Gazzaley had achieved a big leap by showing that a scientifically developed game can reverse or “rewire” some of the effects of aging—not just in the game, but months after the people stopped playing. “He got it to transfer to other tasks, which is really, really hard to do,” she says.

  For her, like Dr. Gazzaley, the applications are potentially widespread—someday we may be able to treat attention deficit disorder, learning disabilities, and attention-related memory loss without drugs, which can have side effects. And these technological therapies would have to be constructed, she said, so that they don’t invite their own side effects, like, say, overbuilding one neural network. “We know the brain is rewirable, the question is to rewire it properly.

  “We’re in the primitive age of brain training.”

  Echoing Dr. Gazzaley, she said she was thrilled at the idea of turning the tables on the technology, making it a slave in new ways rather than having it challenge the brain.

  “Technology is here to stay. Like any revolution, there are pluses and minuses. We need to harness that power to our advantage.”

  This has been, all along, what most excites Dr. Gazzaley—not just the hunt to understand attention, and technology’s impact on it, but the effort to expand it. Take, for instance, the room in his new lab next to the control center where he’s been examining the brain of the teacher wearing the EEG cap. In this other room, there are the latest video game machines, like Microsoft’s Kinect, a motion-capture controller that allows a person to interact with the action on the screen. It’s hooked up to a forty-six-inch Samsung monitor, its empty cardboard box still sitting in the room. There’s ultra-modern science technology here, too, including a wireless EEG cap, which can send signals without being hooked up to wires. Dr. Gazzaley’s idea is to marry the consumer technology with the research tools to ultimately try to create ways for people to improve their attention, focus, and learning ability from the comfort of their own homes—using technology.

  “I want to help rebuild these circuits,” he says. “We can sculpt and shape the neural circuits to refine the brain. We can drastically improve people’s abilities.

  “This is my dream.”

  CHAPTER 50

  REGGIE

  IN 2012, REGGIE STARTED dating a woman named Britney, who was twenty-two at the time, three years younger than he. They’d been friends for a few years while they both worked for the Utah Jazz, the basketball team. By the fall, they were going out regularly—movies, bowling, miniature golf.

  She knew about the accident and seemed to accept him. She had her own struggles growing up, and they felt comfortable in each other’s presence, knowing they had imperfections.

  Reggie was living in an apartment complex in a busy part of Salt Lake City, surrounded by commercial strips, chain restaurants, and retailers. His second-floor, one-bedroom apartment was decorated in the way of a twenty-something male who worked a lot: frozen food in the freezer, a twin bed with a comforter he periodically took home to his mom to wash, a bookshelf with some schoolbooks and movies and video games he didn’t have time to play. In the small living room, there was a hand-me-down love seat and chair from a relative, and the one thing in the place that had a feel of personal touch: a poster of Muhammad Ali standing over a defeated Sonny Liston.

  Reggie loved Ali’s courage and tenacity. Reggie even played around with boxing. In the fall of 2012, he fought an amateur bout in Las Vegas, which drove Mary Jane nuts. Her little guy was going to put his precious head in harm’s way?

  But Reggie felt like he was trying to do some new things, to live life more fully. He took piano lessons and had a public recital.

 
