Rebel Ideas: The Power of Diverse Thinking


by Matthew Syed


  The top of the mountain that day was wreathed in mist, preventing the supporting team from observing their progress, but at 12.50 p.m. the clouds parted for a few moments. Noel Odell, one of their teammates, witnessed them high up on the north-east ridge, five hours behind schedule but ‘moving deliberately and expeditiously’ towards the peak. Neither Mallory nor Irvine was seen again until 1999, when Mallory’s corpse was found at 26,760 feet on the north face. The consensus of historians is that neither man made it to the summit.

  The dangers were palpable to every member of Hall’s team. They had seen the dead bodies that litter the mountainside, and had received stern warnings about the importance of supplementary oxygen. Since arriving at Base Camp at 17,600 feet, they had made three acclimatisation climbs. The first, up the Khumbu Icefall – full of crevasses, moving ice, and the threat of avalanche – had taken them to 19,500 feet. The second and third had taken them first to 21,000 feet, then to 23,500 feet, each hour spent at altitude forcing their bodies and minds to become more accustomed to air which, at the peak itself, would contain a mere third of the oxygen at sea level.

  But now, well into the so-called Death Zone above 26,000 feet, they were in the most forbidding territory in all mountaineering. Hall had already decided that the team would require a turnaround time of 1 p.m., or 2 p.m. at the latest. If they hadn’t made it to the top by then, they would have to head back down. This wasn’t a technical judgement so much as a mathematical one. With three oxygen canisters per person, each providing around six to seven hours of gas, any later would be flirting with calamity. As Hall put it: ‘With enough determination, any bloody idiot can get up this hill. The trick is to get back down alive.’6

  The other complicating factor that day was that other teams were also attempting the summit, a common occurrence given the global fascination with the Himalayan peak. The Mountain Madness team was led by Scott Fischer, an amiable American and one of the most skilled alpine mountaineers in the world. His guides were also superb at their trade, including Anatoli Boukreev, who had climbed Everest twice. Among the clients was Sandy Pittman, an American mountaineer who, like Namba, had completed six of the Seven Summits. She was doing a daily video blog for NBC. Also on the slopes that day was a much smaller team from Taiwan.

  By the time the sun peeked over the rim of the horizon at 5.15 a.m., Krakauer, a member of Hall’s group, had reached the crest of the south-east ridge. ‘Three of the world’s five highest peaks stood out in craggy relief against the pastel dawn,’ he would later write. ‘My altimeter read 27,600 feet.’7 It was a glorious sight, but elsewhere on the slopes tiny problems were beginning to accumulate.

  Ropes hadn’t been pre-installed above 27,400 feet, which led to logjams while they were fixed: Boukreev, Neal Beidleman – a guide in the Mountain Madness team – and Sherpas painstakingly paid out the rope on the exposed upper sections. Meanwhile, Scott Fischer was further down the mountainside. He had expended much energy three days earlier helping his friend Dale Kruse, who had become ill, down to Base Camp, and he was exhibiting symptoms consistent with high-altitude pulmonary oedema, a build-up of fluid in the lungs.

  It wasn’t until a little after 1 p.m. that Krakauer, ahead of the rest of his team, made it to the summit. He was thrilled to have fulfilled a lifetime ambition, but he could sense that the moving parts of the expedition were becoming misaligned. Hall was still well below the summit. Pittman and other team members were becoming ever more tired. The deadline for a safe turnaround was rapidly approaching. Wispy clouds were filling the valleys to the south.

  And yet, perhaps even then, none of the climbers could have guessed that in the coming hours, eight of their number would lose their lives on one of the most infamous days in the history of the world’s most famous mountain. The 1996 Everest disaster had started.

  In the years since 1996, many survivors have given their accounts. Krakauer wrote the bestseller Into Thin Air. Beck Weathers, another client of Hall’s, wrote Left for Dead. IMAX made a documentary called Everest, while National Geographic made a feature called The Dark Side of Everest. In 2015, the disaster was turned into a Hollywood blockbuster starring Jason Clarke, Josh Brolin and Keira Knightley. Everest grossed more than $200 million at the box office.

  And yet, despite the plethora of accounts, there is, to this day, no consensus about what went wrong and what, by implication, should be learned. Krakauer was deeply critical of Boukreev, who, he argued, had advanced too far ahead of his clients. Boukreev hit back with his own book, The Climb, and was defended by many of the most authoritative voices in mountaineering. Pittman, who spent years haunted by what had happened, complained that various accounts had assassinated her character.8 Krakauer, for his part, said that his depiction in Everest (he was played by the actor Michael Kelly) was ‘total bull’.

  Differences of this kind are, perhaps, inevitable, particularly when there is a desire to apportion blame. People had died, families were bereaved, and many were confused as to how things had gone so badly wrong. It is common for first-person accounts to diverge in the aftermath of a disaster, sometimes profoundly. But in this chapter, we are going to look at the possibility that all of these accounts are wrong. We will examine the idea that the problem wasn’t with the actions of any individual, but in the way they communicated.

  In the opening two chapters, we examined how different perspectives can enlarge collective intelligence, often in idiosyncratic ways. Sometimes, however, the benefits of diversity are more prosaic. On a mountainside, different climbers are at different positions on the slopes and are seeing different things. A climber at one point will observe the energy levels of nearby climbers, problems in the vicinity, clouds rolling in from the west. These will not be visible to those at different points of the mountain. One person has a single pair of eyeballs. A team has many. So the question we are going to ask is: how are useful information and perspectives combined? For diversity to work its magic, different perspectives and judgements must be expressed. It is no good having useful information that never gets aired.

  There is also the question of who makes the final decision once the various perspectives have been expressed. If there are competing views, whose wins out? If there are different insights, do we fuse them together, or select one rather than the other? In this chapter, we will move from the conceptual foundations of diversity to the practical implementation.

  In many ways, Everest will prove an apt vehicle for this exploration. Weather conditions are inherently uncertain. No matter how much planning and preparation you have done, there are unexpected twists and turns. The sheer number of moving parts as conditions morph makes huge demands not merely on physical endurance, but on cognitive capacity. Mountaineering is, in this sense, what theorists call a VUCA environment: volatile, uncertain, complex and ambiguous.

  II

  Psychologists and anthropologists don’t agree about much, but one thing they do agree upon is the significance of dominance hierarchies. Humans share hierarchies with other primates and, according to the psychologist Jordan Peterson, even lobsters. ‘The presence of hierarchy stretches back across tens of thousands of generations to the advent of Homo sapiens and, indeed, much further to include other primate species,’ Jon Maner, Professor of Psychology at Florida State University, has said. ‘The human mind is, quite literally, designed to live within hierarchically arranged groups.’9

  The emotions and behaviours associated with dominance hierarchies are so deeply written into our minds that we scarcely notice they are there. Dominant individuals adopt more expansive gestures, issue threats and motivate subordinates through fear. Particularly dominant alphas raise their voices, gesticulate and bare their teeth. This is as true of many bosses in the financial district as of an alpha in a chimpanzee troop. Those in lower positions tend to signal subservience with lowered heads, hunched shoulders and gaze avoidance – what George Orwell termed ‘cringing’.

  Indeed, so highly attuned is our status psychology that you can place five strangers in a room, give them a task, and watch dominance hierarchies developing within seconds. What is even more remarkable is that external observers, who can’t even hear what is being said, can accurately place people at the various positions in the hierarchy, just by watching their postures and expressions.

  Hierarchy is not just what we do, it is who we are.

  The pervasiveness of dominance hierarchies hints that they serve an important evolutionary purpose. When the choices that confront a tribe or group are simple, it makes sense for a leader to make decisions, and for everyone else to fall into line. This boosts speed and coordination. Tribes with dominant leaders tended to fare best in our evolutionary history.

  But in situations of complexity, dominance dynamics can have darker consequences. As we have seen, collective intelligence hinges upon the expression of diverse perspectives and insights – what we have called rebel ideas. This can shut down in a hierarchy where dissent is perceived by the alpha as a threat to their status. Dominance, in that sense, represents a paradox: humans are inherently hierarchical, and yet the associated behaviours can thwart effective communication.

  An incident that brings these ironies to light is United Airlines 173,FN2 a flight that took off from Denver on 28 December 1978, bound for Portland, Oregon. Everything went smoothly until the final approach. The captain pulled the lever to lower the landing gear, but instead of the smooth descent of the wheels there was a loud bang, and the light that should have illuminated to confirm the landing gear was down and secure failed to come on. The crew couldn’t be sure that the wheels were down, so the captain put the plane into a holding pattern while they attempted to troubleshoot the problem.

  They couldn’t see below the plane to check whether the wheels were down, so they conducted proxy checks. First, the engineer went into the cabin: when the landing gear slides down into place, two bolts shoot up above the wing tips, and these bolts were, indeed, up. The crew then contacted the United Airlines Control Center in San Francisco to talk through what had happened, and received advice that the wheels were probably down.

  But the captain still wasn’t certain. What had caused that loud bang? Why hadn’t the light on the dashboard illuminated? Landing without the wheels in place can generally be achieved without loss of life, but it carries risk. The captain, a decent man with long experience, didn’t want to place his passengers in unnecessary danger. He began to wonder whether the light had failed to illuminate because of faulty wiring. Or perhaps it was a faulty bulb.

  However, as he deliberated and the plane continued its holding pattern, a new danger came into play. The plane was running out of fuel. The engineer knew that the fuel was critical: he could see it disappearing on the gauge before his eyes. He also had a powerful incentive to alert the pilot: his life, and the lives of everyone else on the plane, were on the line.

  But this was the 1970s. The culture of aviation was characterised by a dominance hierarchy. The pilot was called ‘sir’. The other crew members were expected to defer to his judgements and act upon his commands. This is what sociologists call a ‘steep authority gradient’. If the engineer voiced his concerns about the fuel, it might have carried the implication that the pilot wasn’t on top of all the key information (which he wasn’t!). It might have been perceived as a threat to his status.

  By 17.46 local time, the fuel had dropped to five on the dials. This was now an emergency. Almost two hundred lives, including that of the engineer, were in severe danger. The pilot was still focused on the bulb, oblivious to the dwindling fuel. Perception had narrowed. You might suppose that the engineer would have said: ‘We have to land now! Fuel is critical!’ But he didn’t. We know from the cockpit voice recorder that he merely hinted at the problem. ‘Fifteen minutes is gonna really run us low on fuel, here,’ he said.

  The engineer was so fearful of directly challenging the captain that he softened his language. The captain interpreted his remarks as meaning that, while the fuel would get low as they circled again, it wasn’t going to run out. This was incorrect, and the engineer knew it. Even at 18.01, when it was probably too late, and with the captain now focused on the plane’s antiskid system, the engineer and first officer were still struggling to state the problem clearly.

  It wasn’t until 18.06, with the engines flaming out, that they finally made the information explicit, but it was too late. They had gone past the point of no return – not because the team lacked the information, but because it wasn’t shared. The plane crashed minutes later, piling into a wooded suburb, ploughing through one house and coming to rest upon another. The lower left side of the fuselage was completely torn away. On a clear evening, with the airport visible since they had entered the holding pattern, ten people died, including the engineer.

  Now, this may seem like a freak event, but the psychology is universal. According to the National Transportation Safety Board, more than thirty crashes have occurred when co-pilots have failed to speak up.10 In one wide-ranging analysis of twenty-six different studies in healthcare, it was found that a failure to speak up was ‘an important contributing factor in communication errors’.11

  This isn’t just about safety-critical industries, it is about the human mind. ‘People often think their own industries are very different’, Rhona Flin, Emeritus Professor of Applied Psychology at Aberdeen University, has said. ‘Actually, if you’re a psychologist who’s worked in different industrial settings it all looks pretty much the same . . . They’re all humans working in these technical environments. They’re affected by the same kind of emotions and social factors.’12

  In an experiment conducted not long after the Portland crash, researchers observed crews interacting in flight simulators, and the same problem kept re-emerging. ‘Captains were briefed in advance to take some bad decisions or feign incapacity – to measure how long it would take for co-pilots to speak up,’ Flin said. ‘One psychologist monitoring their responses commented, “Co-pilots would rather die than contradict a captain.” ’13

  On the surface, the willingness to risk death rather than challenge the alpha may seem odd – certainly something that wouldn’t affect you or me. But the failure to speak up can happen unconsciously. We do it automatically. Think of any workplace. Those in subordinate positions seek to please the boss, parroting his thoughts and even his hand gestures. This eliminates diverse insights, not because they do not exist, but because they are not expressed.

  A clever study by the Rotterdam School of Management analysed more than three hundred real-world projects dating back to 1972 and found that projects led by junior managers were more likely to succeed than those with a senior person in charge.14 On the face of it, this seems astonishing. How could a team perform better when deprived of the presence of one of its most knowledgeable members?

  The reason is that seniority comes at a sociological price when linked to a dominance dynamic. The knowledge squandered by the group when the senior manager is taken out of the project is more than compensated for by the additional knowledge expressed by the team in his absence. As Balazs Szatmari, lead author of the study, put it: ‘The surprising thing in our findings is that high-status project leaders fail more often. I believe that this happens not despite the unconditional support they get, but actually because of it.’15

  The Indian tech entrepreneur Avinash Kaushik has an evocative phrase for the way dominance dynamics influence many organisations. He uses the acronym HiPPO: Highest Paid Person’s Opinion. ‘HiPPOs rule the world, they overrule your data, they impose their opinions on you and your company customers, they think they know best (sometimes they do), their mere presence in a meeting prevents ideas from coming up’, he said.16

  We can see a dominance dynamic in Figure 5. This is an impressively diverse team; they have plenty of coverage across the problem space. And yet when brought under a dominant leader (the dark circle), subordinates don’t say what they truly think, but what they think the leader wants to hear. They echo his thoughts and anticipate his feelings. There is an absence of rebel ideas.

  In effect, they start to migrate towards the alpha, parroting his viewpoint, shrinking their own bandwidth in the process. The cognitive capacity of the team effectively collapses to the parameters of just one brain, as in Figure 6. A team of rebels has – through the process of a dominance dynamic – become the social equivalent of a team of clones.

  Studies of healthcare have shown that junior members of surgical teams fail to speak up because of fear of the surgeon – and the more overbearing the surgeon, the stronger the effect. Remember that leaders are often positioned not merely as powerful, but as smart. How easy for a junior member to comfort themselves with the thought that they don’t need to speak up because the leader already knows whatever they have to say? How easy when this dovetails with the demands pre-wired by the evolved psychology of subordination?

  From this perspective, the behaviour of the engineer on United Airlines 173 begins to make more sense. One can almost sense his thought processes as the fuel runs down, desperately trying to make his concerns known, constrained from doing so by the invisible influence of dominance, frantically seeking to justify his own silence, focusing his frazzled mind on the possibility that the captain already knows the state of the fuel but has come up with a solution.

  When the engineer finally made his concerns explicit, it was too late. The team had all the information it needed, but it wasn’t communicated. The dividends associated with cognitive diversity – in this case, the utterly prosaic fact of two people focusing on different aspects of a rapidly changing situation – were squandered. It led, inexorably, to disaster.

 
