FOCUS
Cognitive Tunneling, Air France Flight 447, and the Power of Mental Models
When they finally found the wreckage, it was clear that few of the victims had realized disaster was near even as it struck. There was no evidence of passengers’ last-minute buckling of seatbelts or frenzied raising of food trays. Oxygen masks were firmly encased in ceiling panels. A submarine probing the wreckage at the bottom of the Atlantic Ocean found a whole row of seats upright in the sand, as if waiting to fly again.
It had taken almost two years to find the plane’s data recorders and everyone hoped that once they were retrieved, the cause of the accident would become clear at last. Initially, however, the recorders offered few clues. None of the plane’s computers had malfunctioned, according to the data. There was no indication of mechanical failure or electrical glitch. It wasn’t until investigators listened to the cockpit voice recordings that they began to understand. This Airbus—one of the largest and most sophisticated aircraft ever built, a plane designed to be an error-proof model of automation—was at the bottom of the ocean not because of a defect in machinery, but because of a failure of attention.
Twenty-three months earlier, on May 31, 2009, the night sky was clear as Air France Flight 447 pulled away from the gate in Rio de Janeiro with 228 people on board, bound for Paris. In the cabin were honeymooners, a former conductor for the Washington National Opera, a well-known arms control activist, and an eleven-year-old boy headed to boarding school. One of the plane’s pilots had brought his wife to Rio so they could enjoy a three-day layover at Copacabana Beach. Now she was in the back of the massive aircraft, while he and two colleagues were in the cockpit, flying them home.
As the plane began its ascent, there were a few radioed exchanges with air traffic control, the standard chatter that accompanies any takeoff. Four minutes after lifting from the runway, the pilot in the right seat—the junior position—activated the autopilot. For the next ten and a half hours, if all went according to plan, the plane would essentially fly itself.
Just two decades earlier, flying from Rio to Paris had been a much more taxing affair. Prior to the 1990s and advances in cockpit automation, pilots were responsible for calculating dozens of variables during a flight, including airspeed, fuel consumption, direction, and optimal cruising altitude, all while monitoring weather disturbances, discussions with air traffic control, and the plane’s position in the sky. Such trips were so demanding that pilots often rotated responsibilities. They all knew the risks if vigilance waned. In 1987, a pilot in Detroit had become so overwhelmed during takeoff that he had forgotten to set the wing flaps. One hundred and fifty-four people died when the plane crashed after takeoff. Fifteen years before that, pilots flying near Miami had become fixated on a faulty landing gear light and had failed to notice that they were gradually descending. One hundred and one people were killed when the craft slammed into the Everglades. Before automated aviation systems were invented, it wasn’t unheard of for more than a thousand people to die each year in airplane accidents, often because pilots’ attention spans were stretched too thin, or due to other human errors.
The plane flying from Rio to Paris, however, had been designed to eliminate such mistakes by vastly reducing the number of decisions a pilot had to make. The Airbus A330 was so advanced that its computers could automatically intervene when problems arose, identify solutions, and then tell pilots, via on-screen instructions, where to direct their focus as they responded to computerized prompts. In optimal conditions, a human might fly for only about eight minutes per trip, during takeoff and landing. Planes like the A330 had fundamentally changed piloting from a proactive to a reactive profession. As a result, flying was easier. Accident rates went down, and airlines’ productivity soared because more customers could travel with fewer crew members. A transoceanic flight had once required as many as six pilots. By the time of Flight 447, thanks to automation, Air France needed only two people in the cockpit at any given time.
Four hours into the trip, midway between Brazil and Senegal, the plane crossed the equator. Most of the passengers would have been asleep. There were clouds from a tropical storm in the distance. The two men in the cockpit remarked on static electricity dancing across the windows, a phenomenon known as St. Elmo’s fire. “I’m dimming the lighting a bit to see outside, eh?” said Pierre-Cedric Bonin, the pilot whose wife was in the passenger cabin. “Yes, yes,” the captain replied. There was a third aviator in a small rest compartment behind the cockpit, taking a nap. The captain summoned the third man to switch places, and then left the two junior pilots at the controls so he could sleep. The plane was flying smoothly on full autopilot at thirty-two thousand feet.
Twenty minutes later there was a small bump from turbulence. “It might be a good idea to tell the passengers to buckle up,” Bonin informed a stewardess over the intercom. As the air surrounding the cockpit cooled, three metal cylinders jutting from the craft’s body—the pitot tubes, which measure airspeed by detecting the force of air flowing into them—became clogged with ice crystals. For almost a hundred years, aviators have complained about, and safely accommodated, ice in pitot tubes. Most pilots know that if their airspeed measurement plunges unexpectedly, it’s likely because of clogged pitot tubes. When the pitot tubes on Flight 447 froze over, the plane’s computers lost airspeed information and the auto-flight system turned off, as it was programmed to do.
A warning alarm sounded.
“I have the controls,” Bonin said calmly.
“Okay,” his colleague replied.
At this point, if the aviators had done nothing at all, the plane would have continued flying safely and the pitot tubes would have eventually thawed. But Bonin, perhaps shaken out of a reverie by the alarm and wanting to offset the loss of the autopilot, pulled back a bit on the command stick, causing the plane’s nose to nudge upward and the aircraft to gain altitude. Within one minute, it had ascended by three thousand feet.
With Flight 447’s nose now pointed slightly upward, the plane’s aerodynamics began to change. The atmosphere at that height was thin, and the ascent had disrupted the smooth flow of air over the plane’s wings. The craft’s “lift”—the basic force of physics that pulls airplanes into the sky because there is less pressure above a wing than below it—began deteriorating. In extreme conditions, this can cause an aerodynamic stall, a dangerous situation in which a plane starts falling, even as its engines strain with thrust and the nose points skyward. A stall is easy to overcome in its early stages. Simply lowering the nose so air begins flowing smoothly over the wings prevents a stall from emerging. But if a plane’s nose remains upward, a stall will become worse and worse until the airplane drops like a stone in a well.
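To put the physics in rough, standard terms (the formula and critical-angle figure here are textbook aerodynamics, not figures from the crash investigation), lift can be written as

$$L = \tfrac{1}{2}\,\rho\,v^{2}\,S\,C_L$$

where $\rho$ is air density, $v$ is airspeed, $S$ is wing area, and $C_L$ is the lift coefficient. At cruising altitude, $\rho$ is already low, and once a wing’s angle of attack climbs past a critical value, typically around fifteen degrees, $C_L$ drops sharply instead of rising. Past that point, pulling the nose higher produces less lift, not more, which is why a deepening stall can be broken only by lowering the nose.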
As Flight 447 rose through the thin atmosphere, a loud chime erupted in the cockpit and a recorded voice began warning, “Stall! Stall! Stall! Stall!,” indicating that the plane’s nose was pointed too high.
“What’s this?” the copilot said.
“There’s no good…uh…no good speed indication?” Bonin responded. The pitot tubes were still clogged with ice and so the display did not show any airspeed.
“Pay attention to your speed,” the copilot said.
“Okay, okay, I’m descending,” Bonin replied.
“It says we’re going up,” the copilot said, “so descend.”
“Okay,” said Bonin.
But Bonin didn’t descend. If he had leveled the plane, the craft would have flown on safely. Instead, he continued pulling back on the stick slightly, pushing the airplane’s nose further into the sky.
Automation has today penetrated nearly every aspect of our lives. Most of us now drive cars equipped with computers that automatically engage the brakes and reduce transmission power when we hit a patch of rain or ice, often so subtly we never notice the vehicle has anticipated our tendency to overcorrect. We work in offices where customers are routed to departments via computerized phone systems, emails are automatically sent when we’re away from our desks, and bank accounts are instantaneously hedged against currency fluctuations. We communicate with smartphones that finish our words. Even without technology’s help, all humans rely on cognitive automations, known as “heuristics,” that allow us to multitask. That’s why we can email the babysitter while chatting with our spouse and simultaneously watching the kids. Mental automation lets us choose, almost subconsciously, what to pay attention to and what to ignore.
Automations have made factories safer, offices more efficient, cars less accident-prone, and economies more stable. By one measure, there have been more gains in personal and professional productivity in the past fifty years than in the previous two centuries combined, much of it made possible by automation.
But as automation becomes more common, the risk that our attention will fail has grown. Studies from Yale, UCLA, Harvard, Berkeley, NASA, the National Institutes of Health, and elsewhere show that errors are particularly likely when people are forced to toggle between automaticity and focus, and that such errors are unusually dangerous as automatic systems infiltrate airplanes, cars, and other environments where a misstep can be tragic. In the age of automation, knowing how to manage your focus is more critical than ever before.
Take, for instance, Bonin’s mindset when he was forced to take control of Flight 447. It is unclear why he continued guiding the plane upward after agreeing with his copilot that they should descend. Maybe he hoped to climb above the storm clouds on the horizon. Perhaps it was an unintentional reaction to the sudden alarm. We will never know why he didn’t return the controls to neutral once the stall warning sounded. There is significant evidence, however, that Bonin was in the grip of what’s known as “cognitive tunneling”—a mental glitch that sometimes occurs when our brains are forced to transition abruptly from relaxed automation to panicked attention.
“You can think about your brain’s attention span like a spotlight that can go wide and diffused, or tight and focused,” said David Strayer, a cognitive psychologist at the University of Utah. Our attention span is guided by our intentions. We choose, in most situations, whether to focus the spotlight or let it be relaxed. But when we allow automated systems, such as computers or autopilots, to pay attention for us, our brains dim that spotlight and allow it to swing wherever it wants. This is, in part, an effort by our brains to conserve energy. The ability to relax in this manner gives us huge advantages: It helps us subconsciously control stress levels and makes it easier to brainstorm, it means we don’t have to constantly monitor our environment, and it helps us get ready for big cognitive tasks. Our brains automatically seek out opportunities to disconnect and unwind.
“But then, bam!, some kind of emergency happens—or you get an unexpected email, or someone asks you an important question in a meeting—and suddenly the spotlight in your head has to ramp up all of a sudden and, at first, it doesn’t know where to shine,” said Strayer. “So the brain’s instinct is to force it as bright as possible on the most obvious stimuli, whatever’s right in front of you, even if that’s not the best choice. That’s when cognitive tunneling happens.”
Cognitive tunneling can cause people to become overly focused on whatever is directly in front of their eyes or become preoccupied with immediate tasks. It’s what keeps someone glued to their smartphone as the kids wail or pedestrians swerve around them on the sidewalk. It’s what causes drivers to slam on their brakes when they see a red light ahead. We can learn techniques to get better at toggling between relaxation and concentration, but they require practice and a desire to remain engaged. However, once in a cognitive tunnel, we lose our ability to direct our focus. Instead, we latch on to the easiest and most obvious stimulus, often at the cost of common sense.
As the pitot tubes iced over and the alarms blared, Bonin entered a cognitive tunnel. His attention had been relaxed for the past four hours. Now, amid flashing lights and ringing bells, his attention searched for a focal point. The most obvious one was the video monitor right in front of his eyes.
The cockpit of an Airbus A330 is a minimalist masterpiece, an environment designed to be distraction free, with just a few screens alongside a modest number of gauges and controls. One of the most prominent screens, directly in each pilot’s line of sight, is the primary flight display. A broad line runs across the horizontal center of the screen, indicating the division between sky and land. Floating atop this line is a small icon of an aircraft. If a plane rolls to either side while flying, the icon goes off-kilter and pilots know their wings are no longer parallel to the ground.
[Illustration: Primary flight display]
When Bonin heard the alarm and looked at his instrument panel, he saw the primary flight display. The icon of the plane on that screen had rolled slightly to the right. Normally, this would not have been a concern. Planes roll in small increments throughout a trip and are easily righted. But now, with the autopilot disengaged and the sudden pressure to focus, the spotlight inside Bonin’s head shined on that off-kilter icon. Bonin, data records indicate, became focused on getting the wings of that icon level with the middle of his screen. And then, perhaps because he was fixated on correcting the roll, he failed to notice that he was still pulling back on the control stick, lifting the plane’s nose.
As Bonin pulled back on his stick, the front of the aircraft rose higher. Then, another instance of cognitive tunneling occurred—this time, inside the head of Bonin’s copilot. The man in the left-hand seat was named David Robert, and he was officially the “monitoring pilot.” His job was to keep watch over Bonin and intervene if the “flying pilot” became overwhelmed. In a worst-case scenario, Robert could take control of the craft. But now, with alarms sounding, Robert did what’s most natural in such a situation: He became focused on the most obvious stimuli. There was a screen next to him spewing text as the plane’s computer provided updates and instructions. Robert turned his eyes away from Bonin and began staring at the scrolling type, reading the messages aloud. “Stabilize,” Robert said. “Go back down.”
Focused on the screen as he was, Robert didn’t see that Bonin was pulling back on his stick and didn’t register that the flying pilot was raising the craft higher even as he agreed they needed to descend. There is no evidence that Robert looked at his gauges. Instead, he frantically scrolled through a series of messages automatically generated by the plane’s computer. Even if those prompts had been helpful, nothing indicates that Bonin, locked on the airplane icon in front of him, heard anything his colleague said.
The plane rose through thirty-five thousand feet, drawing dangerously close to its maximum height. The nose of the airplane was now pitched at twelve degrees.
The copilot finally looked away from his screen. “We’re climbing, according to this,” he told Bonin, referring to the instrument panel. “Go back down!” he shouted.
“Okay,” Bonin replied.
Bonin pushed his stick forward, forcing the plane’s nose to dip slightly. As a result, the force of gravity against the pilots lessened by a third, giving them a brief sense of weightlessness. “Gently!” his colleague snapped. Then Bonin, perhaps overwhelmed by the combination of the alarms, the weightlessness, and his copilot’s chastisement, jerked his hand backward, arresting the descent of the plane’s nose. The craft remained at a six-degree upward pitch. Another loud warning chime came from the cockpit’s speakers, and a few seconds later the aircraft began to shake, what’s known as buffeting, the result of rough air moving across the wings in the early stages of a serious aerodynamic stall.
“We’re in, ahhh, yeah, we’re in a climb, I think?” Bonin said.
For the next ten seconds, neither man spoke. The plane rose above its maximum recommended altitude of 37,500 feet. To stay aloft, Flight 447 had to descend. If Bonin simply lowered the nose, all would be fine.
Then, as the pilots focused on their screens, the ice crystals clogging the pitot tubes cleared and the plane’s computer began receiving accurate airspeed information once again. From this moment onward, all of the craft’s sensors functioned correctly. The computer began spitting out instructions, telling the pilots how to overcome the stall. Their instrument panels were showing them everything they needed to know to right the plane, but Bonin and Robert had no idea where to look.
The stall warning blared again. A piercing, high-pitched noise called the “cricket,” designed to be impossible for pilots to ignore, began to sound.
“Damn it!” the copilot yelled. He had already paged the captain. “Where is he?…Above all, try to touch the lateral controls as little as possible,” he told Bonin.
“Okay,” Bonin replied. “I’m in TO/GA, right?”
It is at this moment, investigators later concluded, that the fate of all 228 people on board Flight 447 was sealed. “TO/GA” is an acronym for “takeoff, go around,” a setting that aviators use to abort a landing and “go around” the runway. TO/GA pushes a plane’s thrust to maximum while the pilot raises the nose. There is a sequence of moves associated with TO/GA that all aviators practice, hundreds of times, in preparation for a certain kind of emergency. At low altitudes, TO/GA makes a lot of sense. The air is thick near the earth’s surface, so increasing thrust and raising the nose makes a plane go faster and higher, allowing a pilot to abort a landing safely.
But at thirty-eight thousand feet, the air is so thin that TO/GA doesn’t work. A plane can’t draw additional thrust at that height, and raising the nose simply increases the severity of a stall. At that altitude, the only correct choice is lowering the nose. In his startled panic, however, Bonin made a second mistake, a mental misstep that is a cousin to cognitive tunneling: He sought to aim the spotlight in his head onto something familiar. Bonin fell back on a reaction he had practiced repeatedly, a sequence of moves he had learned to associate with emergencies. He fell into what psychologists call “reactive thinking.”