A Deadly Wandering: A Tale of Tragedy and Redemption in the Age of Attention


by Matt Richtel


  Ed, on the other hand, couldn’t stop worrying, though he didn’t say it aloud. He kept waiting for another shoe to drop. He just knew it was out there.

  A FEW DAYS BEFORE Thanksgiving, Rindlisbacher visited the stately Cache County Attorney’s Office on Main Street. The trooper had come to see one of the seven county attorneys, a guy named Tony C. Baird. The trooper found Baird standing near the reception area, glancing at some paperwork.

  “Hey, Tony. Can I chat with you for a second?”

  Baird led Rindlisbacher through the frosted glass door into his office. The pair stood and chatted beside two armchairs, beneath one of the dark wooden arches that are found throughout the grand building.

  Baird was around five foot seven and trim but muscular, with the body of a fitness devotee, short hair, a square jaw, and almost preternaturally straight teeth. He had an all-American face that only slightly betrayed the man’s competitive nature. An opposing counsel or a defendant might be rightfully wary that Baird could be a tough foe if provoked.

  Baird wore his standard fare, a black suit and white shirt he got from a department store. It was a lot fancier than his father imagined, even hoped, that Baird would wear to work.

  Baird had been groomed to take over the family dairy farm, located in Lewiston, a small farming town on the border of Utah and Idaho. “Dad threatened me with taking over the family farm, so I decided I’d try some of that college stuff,” explains Baird. He went to Utah State. Then his dad tried to get him to come back to the farm. “So I decided to try some of that law school stuff.”

  After graduating from BYU law school, he worked in several county attorney’s offices in the state, spent a year as the Logan city prosecutor, and in 1997, he joined the Cache County office. By 2006, he was chief criminal deputy.

  Baird, who got up regularly at 4:30 a.m. to train for triathlons, could appreciate Trooper Rindlisbacher’s tenacity. “He had a bit of a reputation,” Baird said. He’d heard that from Rindlisbacher himself. “He kind of goes after people.

  “But I wouldn’t have characterized it as overzealous.”

  That morning in late November, Rindlisbacher arrived with a common request: court approval to further investigate the Shaw accident. Specifically, he wanted to subpoena Reggie’s phone records. He briefed Baird on the case and added that he’d seen Reggie texting in the police car on the way to the hospital. Rindlisbacher said he thought Reggie was lying.

  Baird took it in. If the request for subpoena power was commonplace, the circumstances were not. Texting and driving? Sure, he’d heard a bit about it here or there, but never in the legal context.

  “It wasn’t something that had ever come across my desk. It wasn’t something we’d ever looked at before,” he says. He thought Rindlisbacher’s focus on the issue was “interesting,” a word that almost seems to suggest that it was a bit of a wild-goose chase. Baird wondered: Is there even any law on the books that would allow us to pursue this case?

  His gut instinct was that this case wouldn’t go anywhere. But Baird liked Rindlisbacher. And, besides, the legal standard for getting the court to allow the subpoena wasn’t particularly high. It was “good cause,” less than “probable cause,” a standard aimed at allowing the police to pursue the next level of investigation.

  Sure, Baird told Rindlisbacher. He asked the trooper to get together the affidavit facts to be submitted with the request to the court.

  Rindlisbacher was going to get Reggie’s phone records—and, he hoped, his proof.

  CHAPTER 14

  THE NEUROSCIENTISTS

  AS IN SO MANY other fields, the study of attention broadened and deepened as the twentieth century sped along. And much of the gain owed to technology. Using emerging high-tech tools, researchers pieced together the physical structures, down to the cellular level, involved in attention. Some of these developments were succinctly summarized in a book by Dr. Posner, the contemporary legend from the University of Oregon.

  In the 1970s, for instance, researchers used microelectrodes to home in on a key part of the brain—called the parietal lobe—crucial to shifts in attention. In fact, as Dr. Posner would later help discover, patients with lesions to this part of the brain had more trouble shifting attention.

  Use of electrodes also allowed researchers to measure the time that it takes the brain to reallocate resources when attention shifts. Say someone saw a flash of light. About one hundred milliseconds after this new visual stimulus appeared, the person showed changes at the neurological level. Measures like these added precision to the understanding of how long it takes the brain to react, findings that were very much in the spirit of Helmholtz and mental chronometry, the study of the timing of mental processes.

  In the ensuing years, better technology meant better imaging—positron emission tomography (PET) scans, MRIs, fMRIs, EEGs. The technology allowed researchers to examine finer slices of the neurological networks, their sum and their parts. The network included parts of the brain like the anterior cingulate cortex, the dorsolateral prefrontal cortex, and, crucially, the prefrontal cortex. These areas would show changes—like increases and decreases in blood flow—when study subjects were trying to focus on something, or when they were overwhelmed with information, putting intensified pressure on the attention networks.

  In Dr. Posner’s book, Attention in a Social World, he writes that “it is now possible to view attention much more concretely as an organ system, with its own functional anatomy.”

  Scientists learned more about the capacities and limitations of vision, hearing, reaction time, the networks and parts of the brain most central to attention (and, in turn, to learning), and working memory—the short-term store of memory that people call upon as they maneuver through life. They broke attention down into component parts, or subnetworks, that are somewhat independent but not mutually exclusive: how you control attention, how you sustain it, how you gather information and put it to use, how you inhibit interruption and keep out irrelevant information. And they began to understand the neural basis of the activity. At the Massachusetts Institute of Technology, researchers, working largely with monkeys, identified a powerful, multifunctional type of neuron in the prefrontal cortex. These executive control neurons, unlike many other specialized neurons in more primitive parts of the brain (a neuron in the visual cortex, for instance, might have the singular role of identifying the color red), seem to bring together lots of information from disparate parts of the brain and organize it, helping set direction, goal, and focus.

  At Princeton, researchers used imaging techniques in monkeys and humans to validate and further the understanding of what happens when the brain is asked to consider two different sources of information. When this happens, it appears to create a state of competition inside the brain. The source of visual stimulation that is most relevant to the person gets the lion’s share of the neurological resources. At the same time, as more neurological resources are put to attending to this “relevant” information, there appears to be a decline in the neurological resources put toward attending to the “less relevant” information. That might sound obvious. But the implications are significant. The research gets close to answering a very crucial question, particularly for the digital era: When a person is paying attention to one thing, do they automatically ignore other things, or is there a mechanism for allowing the person to modulate or control how much attention they pay to the other source of stimulation?

  A hypothesis born out of the Princeton work is that attention is a finite resource: Focusing on one source (a person, a mobile device, the road ahead of you, etc.) comes at the cost of lost awareness of everything else. If that hypothesis holds, we face real challenges when we focus on a phone while driving; we can’t just will ourselves to concentrate on both things because our brains are designed to put a huge emphasis on what we deem relevant information, to the point of suppressing brain activity devoted to something else.

  Despite this research, Dr. Gazzaley believes this is a very open-ended question, and a crucial one. He thinks an argument can be made that the brain might be trained in its ability not just to attend but even to multitask. That’s another of the key emerging areas of science: As researchers explored the underlying mechanisms of focus, they also started to look at pushing the limits of attention. In other words, can the ability to focus, once more fully understood, be expanded?

  “Is there a limit to how good we can get?” Dr. Gazzaley asked one sunny afternoon in the summer of 2013, sitting in his office at UCSF. Over his desk, a row of books: The Handbook of Aging and Cognition; Attention and Time; The Cognitive Neuroscience of Working Memory. Beside them, a small kitschy bronze-colored statue of a monkey looking at a brain. In the corner of the desk, behind one of his two monitors, a Buddha statue.

  On a whiteboard, hanging next to his desk, scribbles of blue and red marker reflect a brainstorming meeting he had earlier in the day with a postdoctoral student working on experiments aimed at identifying the neurological basis for how people distribute their attention. Dr. Gazzaley explained that people tend to be good at focusing narrowly, within a relatively small physical space; but when they seek to attend to a broader space, they lose detail.

  The experiments use video game technology to challenge people to distribute attention, then use imaging to measure brain activity during the task. Could they teach people to distribute attention more widely?

  Dr. Gazzaley wore a black shirt and pants and black leather boots with a silver zipper up the side. A white five o’clock shadow crossed his face, and he looked tired, but he smiled. He said he had some good news.

  A study he’d been working on for four years might, any day now, be accepted by Nature, perhaps the most prestigious journal in his field. The study explored whether specialized video games might be used to train older adults to do a better job of juggling two tasks at once, and might even help them become more focused over the long run. The study looked specifically at how older adults could be taught to improve attention using a driving simulator.

  “Getting into Nature for me is like winning an Emmy,” he said. He hoped to know in a few days. It would bring welcome attention to Dr. Gazzaley’s efforts, a personal validation. It would also validate this emerging field of study. Indeed, even to be seriously considered by Nature underscored the real-world applications of this new generation of neuroscience.

  The new body of work tied together the powerful new tools for looking inside the brain with the seminal discoveries of people like Dr. Posner, and with a line of science dating back to Broadbent and Treisman and their peers. Their work had focused on aviation; after all, airplane cockpits were the place where human beings were most acutely confronted with new technologies, powerful ones that taxed our neurological limitations. The cost of failure in the air was astronomical, in lives and dollars.

  From this rich tradition of aviation attention science—dating back to World War II—there grew a new field. It was the application of attention science to driving. One scientist in particular, David Strayer, played a pioneering role, though he faced great opposition when he first started his work.

  IN 1989, DR. STRAYER earned a PhD in psychology from the University of Illinois at Urbana-Champaign. He was developing a specialty in how people become experts and how they acquire skills, and how those skills get compromised. How do they process information? How does information overwhelm them?

  The University of Illinois was one of the foremost places in the world to look at the interactions between humans and technology, an area of study known as “human factors.” The university had a reputation for spawning world-class scientists exploring how to optimize use of technology in airplanes and in the military, in the tradition of Broadbent and Dr. Treisman. The basic idea: How to make machines work best for people without becoming overwhelming.

  That’s what Dr. Strayer figured he would do. In 1990, he went to work for GTE Laboratories. GTE was a telecommunications giant that did a lot of work in the consumer market, as well as for the government, including the military.

  And this was wartime. In January 1991, the United States entered the first Gulf War and was going after Saddam Hussein. As in prior wars, this one presented scientists with the challenge of helping soldiers and their commanders figure out how to take advantage of technology rather than be overwhelmed by it. But the technology issues were no longer just in the cockpit, affecting only pilots; they were everywhere.

  Communications tools and networks had become essential to success. Guys like Dr. Strayer were hired to figure out how to configure networks and displays so that the information was useful and life-preserving—not overwhelming and deadly.

  Dr. Strayer quickly realized that this work had widespread applications. And so, while at GTE, he started noodling a nonmilitary question, one that troubled him and that he couldn’t shake. It had to do with the consumer side of GTE’s business, the part that was making mobile phones and had begun marketing them for cars. (Eventually, GTE would merge with Bell Atlantic to form Verizon, one of the biggest mobile phone providers in the world.) To Dr. Strayer, the idea of using a phone in a car was troubling, at least based on what were by then decades of science establishing the limitations faced by pilots when they taxed their brains with too many visual inputs, sounds, and physical demands.

  He went to a supervisor at GTE and said, “Everything we know from aviation psychology indicates this is likely to be troublesome,” referring to the idea of a car phone. “Before we start marketing this, we should think about it.”

  Shortly thereafter, he heard from his supervisor that the company leadership wasn’t particularly interested in pursuing the safety question.

  “Why would we want to know this?” Dr. Strayer was told. “That will not help us sell anything.”

  Drivers were, arguably, the most important early market for mobile telecommunications. In fact, the first big commercial push for mobile phones was the car phone.

  According to an article published in the New York Times, mobile phone companies, going back as far as the early 1980s, openly marketed the devices for use by drivers. One ad in 1984 asked, “Can your secretary take dictation at 55 mph?” The mobile companies concentrated their cell phone sites along highways, hoping to capture business from bored motorists. They succeeded. A longtime telecommunications analyst named Kevin Roe told the paper that 75 percent or more of wireless company revenue came from drivers well into the 1990s. “That was the business,” he was quoted as saying in the Times. Wireless companies “designed everything to keep people talking in their cars.”

  Seeing these trends in the early 1990s, Dr. Strayer decided he was interested in getting answers—with or without his company’s participation.

  He needed to be in a place where he could pursue answers. That turned out to be the University of Utah, in Salt Lake City. He got an assistant professorship and started putting together experimental protocols, trying to borrow from the masters who’d come before him—Broadbent, Treisman, and on and on—essentially adapting the experiments that had been done for pilots to car drivers and car phones.

  There was nothing revelatory about such an application. It made sense. But on a societal level, the research was nothing short of profound. After all, scientists had been focused for so long on technology used by an elite class of people—not just pilots, but the soldiers and others who were privileged to have access to the supernatural devices.

  But now the devices were becoming part of everyday life. Or they were poised to become so. Cockpits in cars, Dr. Strayer thought.

  Perhaps it was not surprising that Dr. Strayer couldn’t find a lot of funding for his research, given how many financial interests were arrayed in the other direction. So he did his first experiment at the University of Utah by cobbling together about $30 in parts. He built a primitive driving simulator.

  In the experiment, the subject (an undergrad from the university) would sit in a chair holding a joystick, which controlled a car on a computer screen. The subject was told to do simple tasks: (1) follow along a curvy road, and (2) hit a button if a red light came on. That would cause the car to brake. At the same time, Dr. Strayer asked the subjects to talk on a cell phone. Some used a handheld phone, some hands-free, with a headset. Separately, he also had the subjects “drive” while listening to the radio or a book on tape.

  There was a huge difference in the “drivers’” results, depending on what activity they were doing. When they were talking on a phone, they made twice the number of errors as when they were listening to the radio. The error rate was higher whether they used a handheld or a hands-free phone.

  Dr. Strayer says he was struck by the results: “There’s something about the phone conversation that’s really kind of unique.”

  In 2001, he presented his findings at an annual meeting of the Psychonomic Society; it was the very first study to show the effects of talking on a cell phone while driving. The findings were very well received. Dr. Strayer had made an important connection between the past research in the attention/distraction field and the challenges faced by multitasking drivers.

  “We made a link between this and fifty years of research on attention and aviation,” he says. “The attentional limits we saw with pilots apply to drivers of cars. It was an important first step.”

  At the time, though, Dr. Strayer thought it might be a last step, too—a final statement. He didn’t realize the pace at which mobile phones would be adopted, and the extent to which they’d be used in all walks of life, all the time. Plus, he said he couldn’t find anyone to adequately fund the research; there was so much cultural momentum driving adoption and so many powerful business interests that no one really wanted to find out—or was interested in paying to find out—that the magical new technology had some side effects.

  And he was just beginning to understand something that would become much clearer as his research grew: The devices took a driver’s mind off the road, even when hands were on the wheel and eyes were looking ahead.

 
