Farsighted


by Steven Johnson


  In the Affair of so much Importance to you, wherein you ask my Advice, I cannot for want of sufficient Premises, advise you what to determine, but if you please I will tell you how.

  When these difficult Cases occur, they are difficult chiefly because while we have them under Consideration all the Reasons pro and con are not present to the Mind at the same time; but sometimes one Set present themselves, and at other times another, the first being out of Sight. Hence the various Purposes or Inclinations that alternately prevail, and the Uncertainty that perplexes us.

  To get over this, my Way is, to divide half a Sheet of Paper by a Line into two Columns, writing over the one Pro, and over the other Con. Then during three or four Days Consideration I put down under the different Heads short Hints of the different Motives that at different Times occur to me for or against the Measure. When I have thus got them all together in one View, I endeavour to estimate their respective Weights; and where I find two, one on each side, that seem equal, I strike them both out: If I find a Reason pro equal to some two Reasons con, I strike out the three. If I judge some two Reasons con equal to some three Reasons pro, I strike out the five; and thus proceeding I find at length where the Ballance lies; and if after a Day or two of farther Consideration nothing new that is of Importance occurs on either side, I come to a Determination accordingly.

  And tho’ the Weight of Reasons cannot be taken with the Precision of Algebraic Quantities, yet when each is thus considered separately and comparatively, and the whole lies before me, I think I can judge better, and am less likely to take a rash Step; and in fact I have found great Advantage from this kind of Equation, in what may be called Moral or Prudential Algebra.
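  Franklin’s procedure amounts to a simple algorithm: assign each reason a weight, strike out sets of equal weight on opposite sides, and see where the balance lies. Since striking out equal weights leaves exactly the difference of the two totals, the method reduces to a weighted comparison. A minimal sketch, with illustrative weights that are my own assumption, not anything Franklin specified:

```python
# A sketch of Franklin's "moral algebra." Each reason carries a weight;
# cancelling equal weights across the two columns is arithmetically the
# same as comparing the weighted totals. Weights here are illustrative.

def moral_algebra(pros, cons):
    """pros/cons: dicts mapping a short hint of each reason to its weight.
    Returns the prevailing side and the margin left after cancellation."""
    balance = sum(pros.values()) - sum(cons.values())
    if balance > 0:
        return "pro", balance
    if balance < 0:
        return "con", -balance
    return "even", 0

pros = {"steady income": 3, "interesting work": 2}
cons = {"long commute": 2, "less family time": 2}

side, margin = moral_algebra(pros, cons)  # ("pro", 1)
```

  The several days Franklin allows for collecting reasons matter as much as the arithmetic: the computation is trivial once the weights are honest.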

  Like most pros-vs.-cons notepad sketches since, Darwin’s “Marry/Not Marry” litany did not appear to utilize all the complexity of Franklin’s “moral algebra.” Franklin used a primitive but still powerful technique of “weighting,” acknowledging that some arguments will inevitably be more meaningful than others. In Franklin’s approach, the “Ballancing” stage is just as important as the initial stage of writing down entries in each column. But it seems likely that Darwin intuitively calculated the respective weights, presumably deciding that having children might in the long run matter more to him than the “conversation of clever men in clubs.” In terms of simple arithmetic, there were five more entries on the “con” side of Darwin’s dilemma, and yet the moral algebra in his head appears to have led to an overwhelming decision on the side of marriage.

  Most of us, I suspect, have jotted down pros-vs.-cons lists at various crossroads in our personal or professional lives. (I remember my father teaching me the method on a yellow legal pad sometime in my grade-school years.) Yet Franklin’s balancing act—crossing out arguments of corresponding weight—has largely been lost to history. In its simplest form, a pros-vs.-cons list is usually just a question of tallying up the arguments and determining which column is longer. But whether you integrate Franklin’s more advanced techniques or not, the pros-vs.-cons list remains one of the only techniques for adjudicating a complex decision that is regularly taught. For many of us, the “science” of making hard choices has been stagnant for two centuries.

  DELIBERATING

  Think back to a decision you made along the lines of Darwin’s or Priestley’s. Perhaps it was that time you weighed leaving a comfortable but boring job for a more exciting but less predictable start-up; or the time you wrestled with undergoing a medical procedure that had a complicated mix of risk and reward. Or think of a decision you made that belonged to the public sphere: casting a vote in the Brexit referendum, say, or debating whether to hire a new principal as part of your responsibilities on a school board. Did you have a technique for making that decision? Or did it simply evolve as a series of informal conversations and background mulling? I suspect most of us would say the latter; at best, our techniques would not be all that different from Darwin’s jotting down notes in two columns on a piece of paper and tallying up the results.

  The craft of making farsighted choices—decisions that require long periods of deliberation, decisions whose consequences might last for years, if not centuries, as in the case of Collect Pond—is a strangely underappreciated skill. Think about the long list of skills we teach high school students: how to factor quadratic equations; how to diagram the cell cycle; how to write a good topic sentence. Or we teach skills with a more vocational goal: computer programming, or some kind of mechanical expertise. Yet you will almost never see a course devoted to the art and science of decision-making, despite the fact that the ability to make informed and creative decisions is a skill that applies to every aspect of our lives: our work environments; our domestic roles as parents or family members; our civic lives as voters, activists, or elected officials; and our economic existence managing our monthly budget or planning for retirement.

  Ironically, in recent years, we have seen a surge in popular books about decision-making, but most of them have focused on a very different kind of decision: the flash judgments and gut impressions profiled in books like Blink and How We Decide, many of them building on the pioneering research into the emotional brain associated with scientists like Antonio Damasio and Joseph LeDoux. Daniel Kahneman’s brilliant Thinking, Fast and Slow introduced the notion of the brain as divided between two distinct systems, both of which are implicated in the decision-making process. System 1 is the intuitive, fast-acting, emotionally charged part of the brain; System 2 is what we call on when we have to consciously think through a situation. These are undeniably powerful categories in thinking about thinking, but Kahneman’s work—much of it a collaboration with the late Amos Tversky—has largely focused on the idiosyncrasies and irrationalities of System 1. This new model of the brain is helpful in understanding all sorts of dysfunctions, small and large, that plague us in the modern world. We have learned how our brains can be manipulated by credit card schemes and predatory mortgage lenders; we’ve learned why we choose certain brands over others, and why we sometimes fall prey to misleading first impressions in deciding whether to trust someone we’ve just met. But if you read through the clinical research, most of the experiments behind the science tend to sound something like this:

  Problem 1: Which do you choose? Get $900 for sure OR 90 percent chance to get $1,000

  Problem 2: Which do you choose? Lose $900 for sure OR 90 percent chance to lose $1,000

  Problem 3: In addition to whatever you own, you have been given $1,000. You are now asked to choose one of these options: 50 percent chance to win $1,000 OR get $500 for sure

  Problem 4: In addition to whatever you own, you have been given $2,000. You are now asked to choose one of these options: 50 percent chance to lose $1,000 OR lose $500 for sure
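  Folding the initial gift into each outcome shows what makes these puzzles so revealing: Problems 3 and 4 offer identical final-wealth positions, yet one is framed as a gain and the other as a loss. A quick sketch of the arithmetic (my own illustration, not drawn from the experiments themselves):

```python
# Fold the initial gift into each outcome: Problems 3 and 4 then present
# the same final-wealth distributions, though one is framed as a gain
# and the other as a loss.

def expected_value(outcomes):
    """outcomes: list of (probability, final payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# Problem 3: given $1,000, then choose.
p3_gamble = [(0.5, 1000 + 1000), (0.5, 1000 + 0)]  # 50% chance to win $1,000
p3_sure   = [(1.0, 1000 + 500)]                    # get $500 for sure

# Problem 4: given $2,000, then choose.
p4_gamble = [(0.5, 2000 - 1000), (0.5, 2000 - 0)]  # 50% chance to lose $1,000
p4_sure   = [(1.0, 2000 - 500)]                    # lose $500 for sure

# Identical outcomes, identical expected values -- only the frame differs.
assert sorted(x for _, x in p3_gamble) == sorted(x for _, x in p4_gamble)
assert expected_value(p3_sure) == expected_value(p4_sure) == 1500.0
assert expected_value(p3_gamble) == expected_value(p4_gamble) == 1500.0
```
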

  You could fill an entire book with examples of this kind of experiment, and the results that these studies have generated have indeed been eye-opening and sometimes counterintuitive. But as you read through the studies, you start to notice a recurring absence: none of the choices being presented to the experimental subjects look anything like the decision to bury Collect Pond or Priestley’s choice to take on a new patron. Instead, the decisions almost invariably take the form of little puzzles, closer to the choices you make at a blackjack table than the kind of choice that Darwin was contemplating in his notebook. Fields like behavioral economics have been built on the foundation of these abstract experiments, where scientists ask their subjects to wager on a few arbitrary outcomes, each with different probabilities attached to them. There’s a reason why so many of the questions take this form: these are precisely the kinds of decisions that can be tested in a lab.

  But when we look back at the trajectory of our lives, and of history itself, I think most of us would agree that the decisions that ultimately matter the most do not—or at least should not—rely heavily on instincts and intuition to do their calculations. They’re decisions that require slow thinking, not fast. While they are no doubt influenced by the emotional shortcuts of our gut reactions, they rely on deliberative thought, not instant responses. We take time in making them, precisely because they involve complex problems with multiple variables. Those properties necessarily make the logical and emotional networks behind these decisions more opaque to the researchers, given the obvious ethical and practical limitations that make it challenging for scientists to study choices of this magnitude. Asking someone to choose one candy bar over another is easy enough to do in the lab; asking someone to decide whether or not to marry is quite a bit more difficult to engineer.

  But that does not mean the tools available to us in making hard choices have not improved mightily since Priestley’s day. Most of the important research in this multidisciplinary field has been conducted on small- to medium-sized group decisions: a team of business colleagues debating whether to launch a new product; a group of military advisors weighing different options for an invasion; a community board trying to decide on the proper guidelines for development in a gentrifying neighborhood; a jury determining the guilt or innocence of a fellow citizen. For good reason, these sorts of decisions are formally described as “deliberative” decisions. When we first encounter the accused burglar on a jury trial, we may well have an instinctive response of guilt or innocence that comes to us through a quick assessment of his or her demeanor or facial expression, or through our own preexisting attitudes toward crime and law enforcement. But systems engineered to promote deliberative decision-making are specifically designed to keep us from naively falling into those preconceived assumptions, precisely because they are not likely to steer us toward the correct decision. We need time to deliberate, to weigh the options, to listen to different points of view before we render a judgment.

  We don’t need to rely exclusively on social psychology experiments to cultivate our decision-making skills. Recent history abounds with case studies where complex decisions were made by groups of people who consciously adopted strategies and routines designed to produce more farsighted results. We have a lot to learn from studying those decisions, both because we can apply those techniques to our own choices and because we can use that knowledge to evaluate the decision-making skills of our leaders and colleagues and peers. You almost never hear a political debate—or a shareholder meeting—where one of the candidates or executives is asked how he or she goes about making a decision, but in the end, there may be no more valuable skill for someone in any kind of leadership position. Courage, charisma, intelligence—all the usual attributes we judge when we consider voting for someone pale in comparison to the one fundamental question: Will he or she make good choices when confronted with a complex situation? Intelligence or confidence or intuition can only take us so far when we reach one of those difficult crossroads. In a sense, individual attributes are not sufficient. What a “decider”—to use George W. Bush’s much-mocked term—needs in those circumstances is not a talent for decision-making. Instead, what he or she needs is a routine or a practice—a specific set of steps for confronting the problem, exploring its unique properties, weighing the options.

  It turns out that there is great drama and intensity in watching a group of minds wrestling with a complex decision. (Some of the most powerful passages in literature capture this experience, as we will see.) But that slower, more contemplative narrative is often overshadowed by more abrupt events: a fiery speech, a military invasion, a dramatic product launch. We tend to fast-forward to the outcomes of complex decisions, skipping over the journey that led to them. But sometimes, when it matters most, we need to rewind the tape.

  THE COMPOUND

  In August 2010, the Pakistani courier Ibrahim Saeed Ahmed—also known as “al-Kuwaiti,” among other aliases—drove two hours east from the arid valley city of Peshawar up into the Sarban Hills where the city of Abbottabad is located. With known ties to Osama bin Laden and a number of other high-level al-Qaeda operatives, al-Kuwaiti had been a person of interest for the CIA for several years. A Pakistani asset gathering intelligence for the CIA had identified al-Kuwaiti’s white Suzuki jeep in Peshawar and followed him without discovery all the way to a suburb at the outskirts of Abbottabad, a journey that ended on a dirt road leading to a ramshackle compound surrounded by fifteen-foot-high concrete walls topped with barbed wire. After al-Kuwaiti entered the compound, the Pakistani operative sent word back to the CIA that his target had been welcomed into a building that appeared to possess more elaborate security than other houses in the neighborhood. Something seemed suspicious about the setting.

  That deft act of surveillance set in motion a chain of events that would eventually lead to the legendary May 2011 raid and the death of Osama bin Laden, who had managed to live in the compound in relative comfort—certainly compared to the cave-dwelling that many suspected he had resorted to—for almost five years. The story of the attack on bin Laden’s unlikely residence—with Black Hawk helicopters descending on the compound in the early morning hours—has been widely covered as a brilliantly executed military operation, and a resilient one, in that it survived what could easily have been a catastrophic failure when one of the helicopters crashed while attempting to hover over the compound’s interior. The actions taken that night tell a story about bravery, near-flawless teamwork, and quick thinking under almost unimaginable pressure. Not surprisingly, it has been the subject of blockbuster Hollywood films and high-profile TV documentaries, as well as a number of bestselling books.

  But the wider story behind the raid—not just the actions taken that night, but the nine months of debate and deliberation that resulted in the Abbottabad attack—helps explain why the talent for making hard choices has been generally neglected in our schools and the wider culture. We have a tendency to emphasize the results of good decisions and not the process that led to the decision itself. The Abbottabad raid was a triumph of military institutions like the Navy SEALs and the satellite technology that enabled them to analyze the compound with enough precision to plan the attack. But beneath all that spectacular force and daring, a slower and less headline-worthy process had made the raid possible in the first place, a process that explicitly drew on our new understanding about how to make hard choices. The technology deployed to track down bin Laden was state-of-the-art, from the satellites to the Black Hawks. But so was the decision-making. The irony is, most of us ordinary civilians have almost nothing to learn from the story of the raid itself. But we have everything to learn from the decision process that set it in motion. The vast majority of us will never have to land a helicopter in a small courtyard under the cover of darkness. But all of us will confront challenging decisions in our lives, the outcomes of which can be improved by learning from the internal deliberations that led to the killing of bin Laden.

  When news first reached the CIA’s headquarters at Langley that their operative had tracked al-Kuwaiti to a mysterious compound on the outskirts of Abbottabad, almost no one in the CIA suspected they had stumbled across Osama bin Laden’s actual hideout. The consensus was that bin Laden was living in some remote region, not unlike the caves outside Tora Bora where US forces had nearly captured him eight years before. The compound itself was situated less than a mile from the Pakistan Military Academy; many of bin Laden’s neighbors were members of the Pakistan military. Pakistan was supposed to be our ally in the war on terror. The idea that the man who had engineered the 9/11 plot might be living in the middle of a Pakistan military community seemed preposterous.

  But early reconnaissance on the compound only heightened the mystery. The CIA quickly determined that the compound had no phone lines or Internet, and the residents burned their own trash. Al-Kuwaiti’s presence suggested that the building had some connection to al-Qaeda, but the construction costs alone—estimated at more than $200,000—were puzzling: Why would the cash-starved terror network spend so much money on a building in suburban Abbottabad? According to Peter Bergen’s account of the hunt for bin Laden, CIA chief Leon Panetta was briefed about al-Kuwaiti’s visit in August 2010. Officials described the compound—somewhat aggressively—as a “fortress.” The word caught Panetta’s attention, and he ordered the officials to pursue “every possible operation avenue” to discover who was living behind those concrete walls.

  The decision process that led to the killing of Osama bin Laden was, ultimately, a sequence of two very different kinds of decisions. The first took the form of a mystery: the CIA had to decide who was living inside the enigmatic compound. The second decision arose once they had reached reasonable certainty that the structure housed al-Qaeda’s leader: how to get into the compound and either capture or kill bin Laden, assuming the first decision had been correct. The first decision was epistemological: How can we know for certain the identity of the people living in this building on the other side of the planet? Making the decision involved a kind of detective work: piecing together clues from a wide range of sources. The second decision revolved around actions and their consequences: If we simply flatten the compound with a B-2 bombing run, will we ever know for sure that bin Laden was on the premises? If we send in a special ops team to extract him, what happens if they run into trouble on the ground? And even if they’re successful, should they attempt to capture bin Laden alive?

  As it happened, each of these decisions was shadowed by a similar decision in the past that had gone horribly wrong. The Bush administration had wrestled with a comparable epistemological decision—does Saddam Hussein possess weapons of mass destruction?—with disastrous consequences eight years before. And the decision to launch the raid on the compound had echoes both of Jimmy Carter’s failed helicopter rescue of the Iranian hostages and John F. Kennedy’s botched Bay of Pigs invasion. These decisions had been made by smart people working in good faith to make the correct call. The decisions were deliberated on for months, and yet they ended in catastrophic failure. In a sense you can see the ultimate triumph of the bin Laden raid as a rare example of an institution learning from its mistakes by deliberately improving the process that had led to those past mistakes.

 
