Labyrinth: The Art of Decision-Making
Let’s return, though, to the present, where, with all due regard to the intellectual scope of Ray Kurzweil’s imagination, it’s still arguable in which form such long-term prospects are likely to manifest. I believe that the really smart thing is to closely monitor the development of the key meta-trends (as they are less vulnerable to being overturned or destroyed by black swans), and to avoid turkey syndrome and being too taken in by suspiciously precise predictions uttered by various silver-tongued orators. And let’s hope that all this effort to support us does not lead to the total automation of the decision-making process. To err is human, and many wonderful innovations have emerged as a result of this “weakness.” Thanks to gross negligence by scientists, we have acquired, among other things, penicillin (how lucky we are that Alexander Fleming was untidy) and Viagra (a side effect of developing a medicine designated UK92480, for people suffering from angina). Thanks to simple human error, Wilhelm Röntgen discovered X-rays, the artificial sweetener saccharin was discovered by accident, the first fireworks were accidentally set off, the ability of microwaves to cook food was discovered unwittingly, the glue on Post-it notes was a failed super glue... and so on.
And then there are the decisions that we can’t evaluate using a cold, binary logic, because while they might be the right choices, they may have little to do with rationalism.
There is a most beguiling scene about making choices in Mikhail Bulgakov’s fabulous novel The Master and Margarita. The eponymous heroine is having a difficult time. Her love, the Master, has been confined to a psychiatric hospital, where he has become depressed and lost his inspiration for his work on a biography of Pontius Pilate. Meanwhile, on Earth (more precisely, in Moscow), Woland, a.k.a. Satan, has appeared with his retinue, intending to celebrate his annual ball of the dead. Tradition dictates that Woland be served as hostess by a local woman named Margarita, who will be granted a wish after the ball. As you might guess, she agrees, and, after an extraordinary night, stands once more before Woland, who awaits her wish.
“And so, Margot,” Woland went on, softening his voice, “what do you want for having been my hostess tonight? [...] Rouse your fantasy, spur it on!” [...]
Silence ensued, interrupted by Koroviev, who started whispering in Margarita’s ear: “Diamond donna [...] I advise you to be reasonable! Or else fortune may slip away.”
“I want my beloved master to be returned to me right now, this second,” said Margarita. 10
Margarita’s choice of wish is rational only from her perspective, and also perhaps the Master’s; for anyone else, including Woland’s retinue, it is completely incomprehensible. I do hope, though, that despite the predictions of futurologists and science fiction authors, there will still be room in the future for making choices that might seem pointless to others but are the right ones for us, whatever any super-advanced fifteenth-generation quantum computer might think of them.
Epilogue
Our odyssey through the twists and turns of decision-making began with reflections on a movie about a mad world, so it seems apt to end with a movie.
Producers Jerry Bruckheimer and Don Simpson have teamed up many times with director Tony Scott, and each time the results were better than average (certainly in terms of profit). Who doesn’t remember Top Gun from the mid-eighties? Tom Cruise, Val Kilmer, Anthony Edwards, and the then little-known Tim Robbins sat at the controls of F-14s as students at the elite United States Navy Fighter Weapons School, with Tom Skerritt and Michael Ironside in the roles of their instructors. 1 A few years later, the production and direction trio went from sky to sea, resulting in the excellent Crimson Tide, which hit movie theaters in 1995. The action takes place onboard the nuclear-weapon–carrying submarine USS Alabama, commanded by Captain Frank Ramsey (played brilliantly by Gene Hackman), with Lt. Cmdr. Ron Hunter (played equally brilliantly by Denzel Washington) as his first officer and second in command.
Ron Hunter is a highly educated officer, but somewhat green in combat conditions. His intelligence and expertise enable him to assess the state of play and analyze matters very broadly, something he is keen to do. Frank Ramsey, on the other hand, believes in total obedience and deference to the hierarchy. For him, following procedures and giving clear commands to the crew are sacrosanct. His conviction is reinforced by many years of combat experience and successful missions. The men’s disparate perspectives rapidly descend into conflict. During a missile-launch drill, Captain Ramsey gives an order that Hunter is supposed to immediately repeat out loud without question, a doubled command being a formal requirement for a launch. Hunter, though, asks if the decision is correct, thus irritating Ramsey. Later, in private conversation, Ramsey explains the reasons for his anger to his first officer, at the same time clearly setting out his principles and values, which are essentially that those in command should present a united front at all times and not be seen questioning each other. Ramsey makes his views on the subject crystal clear: “We’re here to preserve democracy, not to practice it.” 2
Later still, the situation becomes much more threatening. The plot isn’t complicated—it resembles a somewhat improved and more realistic version of Top Gun. Russia is engulfed by civil war and the ultra-nationalist Vladimir Radchenko, backed by military units loyal to him, seizes a nuclear missile installation. The rebels make demands and the situation escalates, threatening nuclear war with the United States. As you might have guessed, the USS Alabama is sent to an area near the Russian border with the aim of eliminating Radchenko and the threat to the USA. The Americans don’t want to attack prematurely, so they wait for a signal that the rebels are genuinely preparing to launch missiles. When military satellites register the rockets being fueled, it can only mean one thing: Radchenko has ordered an attack. The captain of the USS Alabama receives an order to carry out a preemptive strike and launch ten Trident missiles at the nuclear base held by the rebels. But during launch preparations, the Alabama comes under attack by a Russian Akula-class submarine and is forced to hide in the depths, interrupting the launch procedure. While the Alabama is diving, it receives another order, but because of a sudden communications problem caused by the attack, the message is incomplete—it contains only the words “nuclear missile strike” and no solid, usable information. This leads to a heated exchange between the commander and his first officer, with each interpreting the situation differently. In Ramsey’s view, in accordance with procedure, the last complete message is the binding one, and so they should again try to launch the missiles. The first officer, aware that the command center wanted to pass on new information, possibly amending the earlier order, demands that they surface and make contact with HQ in order to verify the order.
The other officers on the bridge witness the ensuing argument between Ramsey and Hunter, with Hunter insisting that his superior take time to assess the situation rather than blindly following protocol. Frustrated, Hunter draws Ramsey’s attention to a key aspect of that protocol:
Captain Ramsey, under operating procedures governing the release of nuclear weapons we cannot launch our missiles unless both you and I agree! Now this is more than formality, sir. This is why your command must be repeated! It requires my assent! I do not give it, and, furthermore, if you continue upon this course and insist upon this launch without confirming this message first... 3
A furious Ramsey threatens to arrest Hunter for mutiny—only to find Hunter quoting from the regulations and effectively ordering Ramsey’s own arrest.
Captain Ramsey never expected the procedure to be used in such a manner. Arrested by the COB (Chief of the Boat), he’s led to his cabin and locked in. Hunter and the COB are extremely uneasy about how matters have unfolded, not least because they aren’t convinced that Ramsey’s orders were actually wrong.
As you’ve probably guessed, Ramsey, as an old salt, has no intention of giving in, but for those who haven’t seen the film (you should!), I won’t spoil it. We can, however, ask which of the protagonists in this case was right: the insubordinate Hunter, or the experienced captain, who acted precisely in accordance with the rules? Even if, in the final analysis, the first officer made the right decision in halting the attack, shouldn’t breaking protocol and refusing to obey a superior officer have some consequences? What if the rebels had launched an attack while Hunter and Ramsey argued about procedures? 4
Crimson Tide is just a movie about a fictional event, but when we consider the potential consequences of the wrong decision, it’s enough to make your blood run cold. The terrifying fact is that a highly similar incident did actually take place.
At the beginning of the 1980s, there was a sharp downturn in relations between the superpowers. The reason for this was the hard line adopted by President Reagan, symbolized by the launch of the Strategic Defense Initiative (SDI), later referred to as Star Wars, coupled with the ongoing struggle for power within the Central Committee of the Communist Party following the death of Leonid Brezhnev on November 10, 1982. These power plays were bookended by a number of other dangerous events: the outbreak of the war in Afghanistan, the deployment of SS-20 missiles by the Soviets, Operation RJAN (a Soviet simulation of full readiness to launch nuclear missiles), and the USA’s boycott of the 1980 Olympics in Moscow. On top of all this, on September 1, 1983, a Soviet Su-15 fighter shot down a Korean Air Lines Boeing 747 that had entered Soviet airspace near Kamchatka during a flight from Anchorage to Seoul. All 269 people on board were killed. Tensions reached a peak, and the prospect of nuclear war grew increasingly likely, with every move of the enemy being closely observed.
The early warning system in the Soviet Union at that time was based on a network of military satellites known as Oko, which passed on data to the Serpukhov-15 monitoring station to the south of Moscow. One of the officers serving at Serpukhov was Stanislav Yevgrafovich Petrov, a forty-four-year-old Lt. Col. in the USSR Air Force. On September 26, 1983, at precisely four minutes past midnight, he saw a message on his screen that he had hoped never to see: the system told him a ballistic rocket was being launched from the Malmstrom Air Force Base in Montana. A minute later, another alarm went off in the command room, alerting Petrov to the launch of four more missiles from the same base.
Procedure clearly set out what actions Petrov should take. The first step was to inform the First Secretary at the time, Yuri Andropov, of a nuclear assault. Andropov was co-creator of Operation RJAN, and utterly convinced of the aggressive intentions of his country’s adversary. Petrov was perfectly aware of what the consequences of passing on a sparse, terse message to the ailing former KGB boss would be: global nuclear war. So the colonel took an unusually bold decision: assessing the situation, he decided to disregard the procedure and not inform the authorities. He assumed that no one would initiate World War III by launching just five missiles, so the alert must have been the result of a fault or error in the early warning system. Of course, he couldn’t be certain, and like Ron Hunter in Crimson Tide, he went through twenty-five minutes of hell, not knowing if American warheads would hit. If his decision to flout the rules had been wrong, the Soviet Union would have ceased to exist.
Ultimately, it turned out that the signal about the missile launch was a false alarm; the poorly designed satellite early warning system had interpreted the sun reflecting off a high layer of cloud as a sign of a missile launch and triggered the alert. If at that time in Serpukhov the equivalent of Captain Frank Ramsey had been perched in front of the monitors, you wouldn’t be reading this book now. Fortunately, it was Stanislav Petrov sitting there, a man who had the courage to think for himself and most likely saved the world from nuclear destruction. 5
Both these cases, fictional and true, have a common denominator—the justified breaking of established procedure. It is a classic dilemma that we often encounter in the business world. So why do we bother setting up decision-making procedures? To what end do we create management algorithms? What should we do in a black swan situation (e.g., receiving an incomplete order because of a communications breakdown)? Aren’t procedures created for just such situations, guaranteeing a predictable response from a manager, team, or organization? Or is it precisely in those circumstances that we should abandon those procedures because they were created in ignorance of the ongoing situation? Do we need fixed rules, or an individual assessment of the situation?
Well-defined, regularly updated procedures provide several measurable benefits. First, they increase the likelihood of the right decision being made by an employee, because they’re based on the accumulated experience of many individuals and have been repeatedly tested. In this way, we reduce the risk of poor decisions being made by a relatively inexperienced person. Second, procedures increase the predictability of an organization, which is important, not only from the management perspective, but also from the perspective of other employees (we know what to expect in any situation). Third, operating procedures can speed up responses in certain situations, as things happen along tried and tested lines. Fourth, in many cases, procedures provide an extra level of safety in key company processes, like the routine preflight checks before a plane takes off. In this sense, procedures guarantee that no fundamental issues will be ignored in any given instance.
So, it might seem that there is no better recipe for making the right decisions than a well-defined, comprehensively tested, and consistently improved set of procedures, with clearly identified roles, rights, and responsibilities assigned to individual employees. Unfortunately, the two situations described above show that there are circumstances in which even the best possible procedures turn out to be poor advisors; just following the rules could have led to nuclear catastrophe. The situations in which procedures might let you down are precisely those involving a black swan. This shouldn’t come as a surprise, seeing as black swans could also be called “events outside procedure”: something for which we are entirely unprepared, and which no established rules could have taken into account. At that point, the human element becomes crucial. Bombarded by information, we are subject to emotions and exposed to all kinds of traps set by our minds.
Such situations are a particular challenge to leaders.
The model of the single charismatic leader is now falling into disuse and though many of us still admire such visionaries as Steve Jobs, Richard Branson, Jack Welch, and Elon Musk, investors’ attention is drawn more and more to highly effective and profitable firms led by—are you ready for this?—decidedly uncharismatic people. It turns out that the key to success lies in building great teams and shaping the organizational culture in such a way that many people display leadership, not just the chairperson or managing director. During one of the Harvard Business Review conferences I spoke at in Poland, I named this the “model of dispersed leadership.”
Dispersed leadership means helping employees to develop certain habits and attitudes, encouraging them to be aware of changes taking place in the business world, to observe the world around them through a critical lens, and to feel able to speak about trends they’re observing that could have an impact on the business. This entails empowering employees to assume a level of responsibility and make decisions that correlate with their level of authority. It also means understanding the influence that different people have on the quality of decision-making. Developing such attitudes in employees means that in any company, there will be dozens, hundreds, even thousands of “daily leaders,” who operate just like the major decision-makers described elsewhere in this book. This will give any company the biggest gain it could wish for: minimized business risk. You might think that encouraging people to make independent decisions increases risk rather than reducing it, and it’s true that in a dispersed leadership model the number of minor errors made by employees in good faith increases. However, it radically reduces the strategic risk of failing to spot a black swan on the horizon, and of making a decision that would be wrong for the organization as a whole. Think back to the story of Encyclopædia Britannica: an employee there spotted the approaching black swan (Wikipedia), recognized the threat, and told his superiors. However, they ignored the signal and just carried on operating in their tried and tested way. It wasn’t long before they realized they’d made a terrible mistake. It’s worth remembering, then, that while the predictability of employees’ behaviors falls in a dispersed model, and the number of minor errors increases, the quality of the most important decisions improves. The enormous benefit of reducing strategic risk always outweighs the harm caused by smaller errors. Dispersed leadership counterbalances charismatic leadership, enabling us to avoid its central weakness: bad decisions made by a leader whom no one questions.
This isn’t just a theory, as the world of aviation discovered.
In the 1970s, there were a number of catastrophes that shared a common denominator: flight crews rarely doubted the wisdom of a pilot’s decision, and even when they did have doubts, they were too afraid to confront the pilot with their concerns. The rigorous recruitment procedures and training, as well as the requirement to accumulate thousands of logged flying hours, meant that airline pilots had achieved almost god-like status among other employees. During a flight, the captain was the absolute authority and his (they were mostly men at that time) decisions were never to be questioned. The copilot, flight engineer, head of the cabin crew, and flight attendants simply did their jobs and obeyed the captain’s orders. The system worked as long as the captain made the right choices. When he was wrong, the remaining crew members were in no position to oppose him, or even to express their opinions or concerns.
The air accident commission investigation into the biggest aviation disaster in history, which killed 583 people on Tenerife in 1977, revealed that the catastrophe was in large part caused by this flawed system. On March 27, early in the afternoon, a bomb went off in the terminal building of Gran Canaria Airport, the islands’ main airport, injuring several people. Following this incident, most flights were redirected to Los Rodeos airport on the neighboring island of Tenerife. During the day, air traffic control at Los Rodeos buckled under the pressure of the extraordinarily crowded airspace and runway aprons, where plane after plane was landing. The situation was made worse by difficult weather conditions, including thick fog, as well as the fact that, as it was Sunday, there were only two controllers on duty to cope with the massively increased air traffic. Shortly after 5:00 pm, a catastrophic collision of two Boeing 747s occurred. A KLM plane that was taking off smashed into a Pan Am airliner at almost 125 miles per hour, killing everyone on board the Dutch jumbo jet (248 people) as well as 335 passengers and crew on the Pan Am aircraft; 61 people survived.