Still, Schulze’s expectation demonstrates the transition over time to more reliable airplanes and engines. They simply don’t fail as they did in the early days of air travel. What has not changed is the fallible human, unpredictable at every level except for how reliably he or she will make mistakes. Something had to be done to make people perform better. Mechanical engineers and aerodynamicists had their place, but beginning in the 1970s, a new kind of specialist was digging into the “soft” sciences of psychology, ergonomics, communication, and design. These people were practicing in a relatively new field called human factors. They worked on ways to help airlines select and train pilots. Programs widely used in the military were adapted for civilian airlines. They researched how to enhance the flight deck and improve communications—what, in the jargon of aviation, is called information transfer—so that misunderstandings could be averted and errors prevented before they ended in tears. When a disaster happened, it could be turned into a valuable learning opportunity. The three biggest developments in human factors were triggered by pilot error, including a development as simple as it is profound: the checklist.
In 1935 three airplane manufacturers were competing to provide the U.S. Army Air Corps with a bomber capable of carrying a ton of ordnance a distance of two thousand miles. Boeing thought it had just the thing: an all-metal, four-engine model it called the Boeing 299. A Seattle Times reporter named Richard Williams, upon seeing the enormous new airplane for the first time, reportedly exclaimed, “Why, it’s a flying fortress!” and the nickname stuck.
Douglas and Martin were also in the running with twin-engine models. The Martin 146 and the Douglas DB-1 could carry the load, but neither could match the range. Even before the flying demonstration began, the army procurement officers were pushing to buy sixty-five Flying Fortresses. The competition was Boeing’s to lose.
On October 30, the day of the fly-off at Wright Field in Dayton, Ohio, two army pilots, two men from Boeing, and a representative of the engine manufacturer Pratt & Whitney climbed in and took off. There were three pilots aboard: Ployer P. Hill and Donald Putt from the army, and Boeing’s chief test pilot, Leslie Tower. As they taxied across the airfield, none of them noticed that the elevators and rudder, the movable panels that control the plane’s up-and-down and side-to-side motion, were still secured with a gust lock, which kept them from swinging in the wind and getting damaged while the plane was on the ground. When the plane took off, it was locked into a configuration for a steep climb that the pilots could not correct in the air. The giant aircraft stalled and fell to the ground, erupting in flames. Two of the men died from their injuries.
Before that flight, the only negative about the Boeing 299 was that it might be too complex, but it was a simple oversight that brought it down, literally and figuratively. The plane was disqualified from consideration by the Army Air Corps, and the big bomber order went to Douglas.
What could be done to protect against forgetfulness? The army pilots got together and came up with a checklist. In twenty-four steps, from “before taxi” to “after takeoff,” the pilot would be reminded of each critical task. Boeing went on to build the Model 299 as the B-17, and the Army Air Corps bought it and flew it for decades. The Flying Fortress entered the history books. So, too, did the pilot checklist.
Checklists aren’t a perfect solution. Every fix has unintended consequences. The same checklist read eight times a day might not be met with the same level of concentration, a phenomenon that former NASA scientist and human factors expert Dr. Key Dismukes calls “seeing but not seeing.”
The number of checklists on an uneventful journey is about a dozen. On an eventful one, such as Qantas Flight 32, an Airbus A380 jumbo jet that experienced an uncontained engine failure shortly after takeoff in 2010, the pilots went through about one hundred twenty checks. Due to the way the Rolls-Royce Trent 900 engine exploded, Capt. Richard de Crespigny and his first officer, Matt Hicks, had one engine out; the three others not working properly; an inability to transfer fuel; problems with electrics, communication, flight controls, hydraulics, and pneumatics; and “a whole bunch of other stuff,” as de Crespigny described it.
They were busy assessing the condition of the plane and planning their next steps while emergency checklists kept appearing on the flight display “like dinner plates at an all-you-can-eat buffet,” according to de Crespigny. “I think I invented the term checklist fatigue,” he said—but he hadn’t. De Crespigny just had the most high-profile experience with the phenomenon of checklists overwhelming pilots, characterized as “stop interrupting me when I’m busy” and identified by Dismukes back in 1993.
Dismukes wasn’t calling for an end to checklists, which, if used properly, might have prevented disasters such as Helios Flight 522, where the pilots seem to have neglected to set the cabin pressurization system to automatic; or the Spanair Flight 5022 crash that followed an attempted takeoff without flaps. He was recognizing that as machines grow in complexity, every part of how humans interact with them must evolve, too. “How complex is too complex?” is a question that would come up again and again.
In the summer of 2009, the sixty-year-old captain of a Continental Boeing 777 keeled over at the controls of a flight carrying 247 people from Brussels to Newark. Passengers heard flight attendants ask if there was a doctor on board. There was, but it was too late for Craig Lenell. Airlines are required to have two pilots on every flight, and on flights over a certain number of hours, there can be three or more: the crew flying plus a pilot or crew in reserve.
“If something happens physiologically to one of the pilots, the other one is seamlessly able to carry on,” ABC News aviation consultant and retired airline pilot John Nance said.
A pilot dying at the controls is pretty rare. The benefit of having two pilots is realized far more often in less dramatic circumstances. As Nance explained, two pilots provide “two brains and two sets of eyes” for the flight. At its best, a two-pilot crew operates as a precision team sharing a common view of the task before them, and separate views of each other. The terms challenge and response, monitoring and cross-check, and the information transfer I mentioned earlier all describe this relationship, which might also be called communication.
Entire books have been written on how to improve the way pilots communicate, because without special training, pilots can misunderstand each other as easily as any other two people who meet briefly and maybe for the first time and set out to accomplish a task together. Yet in the cockpit, the stakes are higher.
The deadliest aviation accident ever, the collision of a KLM Royal Dutch Boeing 747 with another 747 being operated by Pan American World Airways, made it very clear that more attention needed to be paid to the seemingly simple task of talking and the complex phenomenon of hierarchy on the flight deck. The crash happened on March 27, 1977, on a runway in Tenerife, in the Canary Islands.
Both planes and several others had been diverted from Las Palmas in Gran Canaria to the Los Rodeos Airport in Tenerife. A bomb at the Las Palmas airport terminal had shut it down, and three hours passed before it was reopened. When it was, airline crews started preparing to leave Tenerife for the short flight to their original destination, Las Palmas.
On the KLM plane the pilots in the cockpit were under the command of fifty-year-old Jacob van Zanten, head of flight training and, as the face on the airline magazine ads, something of a superstar in the Netherlands. The delay must have weighed on van Zanten, who was operating along with the other pilots under new flight time restrictions. If they were held up much longer, they would not be allowed to make the return flight from Las Palmas to Amsterdam.
If that happened, hundreds of travelers would have to be accommodated in hotels, and the jumbo jet would sit at the airport overnight instead of providing revenue to the carrier. And there was another factor to consider: the commander’s ego. As Nance explained, there is “the embarrassment of a senior leader in being unable to make happen what he wanted to happen.” In the years to come, this and other similarly subtle pressures would be explored more thoroughly for their impact on the decisions made that fateful afternoon.
There were far more planes at Los Rodeos than gates at which to park them, so planes parked on a few of the taxiways. But this created a new problem: they were blocking the way to the runway for departing flights.
Controllers told the departing crews to follow an unusual procedure known as backtaxiing. One plane was to taxi down the runway followed by the second. When the first plane arrived at the departure threshold, it would make a one-hundred-eighty-degree turn into takeoff position. The following plane had to pull off onto a taxiway to get out of the way of the plane positioned for takeoff. When the first plane departed, the second crew would taxi into position and go.
Two planes had already taken off, and the Pan Am and KLM wide-bodies were next. KLM led the way, followed by Clipper Victor, under the command of the coincidentally named Victor Grubbs, fifty-six. Robert Bragg was the first officer, and George Warns was the flight engineer.
As the Pan Am plane rumbled down the runway, the skies were clear, Bragg recalled, but before the crew could find the taxiway where they were to pull off to get out of the KLM’s takeoff path, a dense fog rolled in. “Our visibility went from unlimited to 500 meters in under one minute. The tower even made a call stating, ‘Gentlemen, be advised that runway visibility is 500 to 700 metres,’” Bragg, thirty-nine at the time, wrote in an article for Flight Safety Australia. The fog was so thick that the Pan Am crew determined it was below the takeoff minimum and assumed that the runway was now closed. Grubbs, a pilot with twenty-one thousand flight hours, continued to steer the plane through the fog, but slowly. All three men strained to find the turnoff.
On the takeoff end of the runway, however, the fog had passed. Van Zanten pivoted the KLM airplane around to point in the direction of takeoff. His plane was now head to head with the slow-moving Pan Am jumbo, unseen in the cloud a half mile ahead. He pushed forward on the throttles—which seemed to startle his first officer, thirty-two-year-old Klaas Meurs. “Wait a minute, we don’t have ATC clearance,” Meurs said. The first officer had experience with this captain. It was van Zanten who had qualified him on the 747 just two months earlier.
Van Zanten pulled back the power and instructed Meurs to call ATC. The first officer radioed that he was ready for takeoff and waiting for clearance. The tower controller replied with departure and navigational information, but didn’t issue the clearance to take off.
Just as Meurs was confirming the instructions, van Zanten said, “We go, check thrust.” Once again, the captain fed fuel to the airplane’s engines.
Meurs keyed the mic and read back the controller’s words, adding something that sounded like “We are now at takeoff.”
In an analysis of the accident by the Air Line Pilots Association (ALPA), this ambiguous phrase would receive a lot of scrutiny. The association’s pilots concluded that the KLM first officer thought something was wrong with Captain van Zanten’s decision and was “trying to alert everyone on frequency that they were commencing takeoff.”
Captain Grubbs, on the Pan Am Clipper, heard the transmission and was surprised. “No,” he said, followed a second later by the controller telling KLM, “Stand by for takeoff. I will call you.”
“And we are still taxiing down the runway,” Pan Am’s First Officer Bragg added.
The Pan Am pilots must have thought they were making it clear to KLM that the Pan Am 747 was still in the way, but these messages were drowned out by the dueling transmissions, which created a shrill noise in the KLM cockpit.
“Report when runway clear,” the controller told the Pan Am crew, and Pan Am responded, “Okay, we will report when clear.”
As the KLM 747 accelerated down the runway, its first officer, Meurs, and flight engineer, Willem Schreuder, heard this exchange. It prompted Schreuder to ask, “Is he not clear, then?”
“What do you say?” Captain van Zanten asked.
“Is he not clear, then, that Pan American?” Schreuder repeated.
“Oh yes,” van Zanten and Meurs answered at the same time.
Captain van Zanten had made his decision. “To reassess that decision at such a critical point in the takeoff may have seemed an intolerable idea,” the human factors specialists at ALPA concluded in their report, citing the other factors that must have been on the captain’s mind, including the heavy airplane, wet runway, and poor visibility.
The KLM Boeing 747 was closing in on the other jumbo jet still obscured in the soup.
On Clipper Victor, Captain Grubbs was startled to see the lights of the KLM airliner rapidly approaching through the mist.
“Goddamn, that son of a bitch is coming straight at us,” he said. He applied power to the throttles and turned the nose wheel to the left in a desperate attempt to get out of the way.
At this point, KLM’s van Zanten also saw the impending collision. With his aircraft moving too fast to stop, he pulled back on the yoke in a frantic effort to take off over the Pan Am Clipper. The tail of the KLM plane scraped along the runway as the front end lifted enough for the nose gear to clear the dome of the Pan Am 747.
Seated on the right side of the Pan Am flight deck, facing directly toward the KLM jet, First Officer Bragg remembered his horror. “Get off, get off, get off,” he screamed at Grubbs as the underside of the KLM plane rose above him.
“I ducked, closed my eyes, and prayed, ‘God, let him miss us.’ When it did hit our plane, it was only a very short, quiet shudder. I actually thought that he had, in fact, missed us until I opened my eyes.”
As the KLM 747 dragged its undercarriage across the upper deck of the Clipper Victor, it tore off a huge section and then slammed back down onto the runway, skidding another fifteen hundred feet in a burst of sparks and explosions as the fuel spraying from ruptured tanks ignited.
All 248 people aboard the KLM flight were killed. Of the 396 people on the Pan Am jet, 70 survived the crash (though nine died later), a fact Bragg credits to Grubbs’s quick work getting at least the front end of the plane out of the way.
Among the survivors, some said the impact was like a bomb going off. Its effect on the airline industry was equally explosive. It was not simply that two airliners had collided; there had been collisions before. But the number of casualties and the cascading series of communication failures were a loud wake-up call to the industry.
Everywhere one looked, errors had been made, most obviously by the three men on the KLM flight deck, who failed to communicate their concerns clearly. The collision also revealed the danger of aviation’s long-established “right stuff” religion. The dogma consists of the belief that the captain is always right and that good pilots never make mistakes. In a 1990 article for the Flight Safety Foundation, Robert Besco, a retired airline captain and a consultant in human performance, wrote, “Pilots have adopted an attitude of risk denial.” If the captain was considered God and everyone else a congregant, you can see how pilots would not, or could not, speak up even when they saw that something was wrong.
John Lauber, working at NASA’s Ames Research Center in California at the time, had already spent several years noting the growing disparity between the reliability of the machine and that of the human flying it. He had visited airlines and spoken about a new concept he called cockpit resource management, or CRM. One of the airlines he visited was KLM Royal Dutch. One of the pilots he met was Captain van Zanten.
Lauber remembered, “He was a very impressive guy, a blond, steely-eyed airline pilot. He was a strong-minded personality.” Lauber was pitching CRM as something airlines could use to train pilots to better manage their workplace. Yet they “had not done anything in terms of developing programs that address these issues,” Lauber recalled.
CRM was about more than teaching communication and moderating hierarchy. It was intended to help pilots manage a wide array of pilot errors that Lauber had seen in a review of eighty accidents in the 1970s and ’80s. The one that stood out was the flight into terrain of a brand-new Lockheed L-1011 in Miami in 1972.
Capt. Bob Loft was one of Eastern Airlines’ most senior captains, with thirty thousand hours of experience. His first officer, former air force pilot Bert Stockstill, had six thousand hours and even more time in the L-1011 than Loft had. The flight engineer, Don Repo, was a twenty-five-year Eastern employee with nearly sixteen thousand flight hours.
As Eastern Airlines Flight 401 approached Miami International Airport in late December 1972, the nose wheel landing gear indicator light did not illuminate. Without knowing whether the failure was of the light or of the gear, the captain did a go-around. Cleared to fly at two thousand feet, the crew began diagnosing the situation, and during the process of removing the light fixture and trying to reinsert it, someone inadvertently turned off the altitude hold. This went unobserved by the three pilots and the mechanic occupying the jump seat because all four men were trying to figure out whether the problem was with the gear or just the warning light.
As the plane dropped from its assigned altitude, an alert began to sound, but no one seemed to hear it. They were close to concluding that it was just a bad bulb when Stockstill noticed the plane’s descent.
“We did something to the altitude,” the first officer said.
“What?” said Captain Loft.
“We’re still at two thousand, right?”
Loft’s final words showed his confusion: “Hey, what’s happening here?”
Flight 401 smashed into the Everglades, killing 101 of the 176 on board.
When John Lauber came across the detailed report, he called it a “prototype” of an accident in which the crew does not manage the resources available. His cockpit resource management would teach pilots how to do this, in the same way businesses train their managers. “Pilots generally were well trained on aircraft systems and basic flying skills,” he said. But nothing was done to teach them what they needed to know about decision making, communication, and leadership.