You're It


by Leonard J Marcus


  Consider this metaphor: Two groups of people are assigned the task of describing a shape inside an opaque cube. One group looks through peephole A on the side of the box. They see a triangle. The other looks through peephole B on the top of the box. They see a circle. The two groups fall into conflict about what is in the cube, and each substantiates the validity of its claims based on professed superior experience, values, intelligence, or power.

  Discovering that it is neither simply a circle nor a triangle but in fact a cone depends upon the two groups’ willingness to share and combine their different observations. As a meta-leadership tool, the Cone-in-the-Cube, of course, is a parable. In reality, there are many peepholes in the “cube” and a much more complicated collection of shapes inside, presenting numerous variations in how people of individual perspectives and dissimilar expertise perceive the same phenomenon and reach divergent conclusions.

  This is foundational thinking for meta-leaders. You expect that different people will see and understand the situation from their distinct vantage points. You encourage a meta-view by comparing, combining, and integrating these viewpoints and helping others see the bigger picture as well. The real task is acknowledging and analyzing those distinct perspectives. If it’s done right, the stakeholders recognize the complexity—they see that “there is a Cone in the Cube”—and achieve a more blended and balanced perspective. The ultimate mission of the meta-leader is to fuse those streams of thought into a common purpose and a shared narrative to achieve unity of effort among the many involved constituencies.

  People view the world through a lens crafted by their distinct experiences, expertise, allegiances, values, and objectives. Recognize that different individuals, even if they have similar backgrounds, can still observe the same phenomena and reach wildly varying conclusions. And especially if the discourse becomes polarized and adversarial, there will be scant curiosity about what others believe to be true and why. It’ll be difficult to even ask the simple question, “What do you see from your angle?” Rigidities grow when much is at stake or when someone is in a state of panic.

  Cognitive Biases

  In the “people follow you” equation of meta-leadership, your job is finding and crafting common ground. There are plenty of reasons why people cling so tenaciously to their limited or fixed perspectives when approaching a problem, even in the face of contradictory information or data. These include blatant selfish interest, self-justifying evidence, spite, and rigid personal stubbornness (“it’s all about me”). And don’t forget the more earnest reasons of conviction, passion, and well-intentioned advocacy (“it’s all about the mission”). The complexity of your situation is animated by a range of diverse and, at times, contradictory attitudes. The more nimbly you identify and work with clashing points of view, the more effectively you’ll meta-lead people toward what you hope to accomplish.

  Your brain receives a cacophony of information to process—which is perhaps even more likely to happen during a crisis. To enhance efficient processing, that information is filtered into pre-established formulas, attitudes, and beliefs embedded in your thinking. These are known as cognitive biases. You see something, and your brain reaches for a quick explanation for what is happening or what it means. Once the impression is triggered, it is hard for your brain to be convinced otherwise, despite both contrary facts and logic. Oddly enough, these filters can sway your brain to perceive just what you expect to see. Cognitive biases can be triggered by what someone is wearing (torn jeans), where they went to school (Harvard), where they work (the government), or how they speak (with a Southern accent). Your brain makes quick risk-and-reward calculations and bases its conclusions on those indicators. “Well, obviously, anyone that (blank) is a (blank).” You fill in the blanks for yourself, and you grasp how others do the same.

  Cognitive biases are formulaic pathways for efficient reasoning, and human brains cling to the comfort, ease of decision-making, and conclusions they provide. When something new comes along, the information is forced into a preconfigured pattern. And new data that do not conform to preexisting biases are rejected, serving only to reinforce the rigidity of those biases unless a correction is consciously made. For example: That person with the ripped jeans just made a brilliant suggestion. Maybe she’s smarter than I thought.

  Many times, the shortcuts provided by cognitive biases are helpful. Sometimes, however, they distort your thinking in ways that are incorrect and dangerous. A multitude of biases, shaped by culture, prior experiences, and personal preferences, affect your perceptions of situations and the decisions you make. They can both inform and blind you.

  One type of cognitive bias is confirmation bias, which causes people to elevate information that validates their existing worldview and discount anything that challenges it. After buying a car, you look for evidence that affirms that you made the right choice. Another common cognitive bias is availability bias: overweighting that which comes most easily to mind. For example, after a dramatic plane crash, many people question the safety of flying despite unequivocal evidence that it is the safest mode of travel. Self-serving bias interprets information in ways that benefit you. For example, if you give a talk and you get positive feedback from the audience, you may attribute that success to your material and your upbeat presentation style. If the feedback is not so positive, you may decide that because you were scheduled to speak right after lunch, the audience was in a “food coma.” Self-serving bias also leads you to rate the comments and opinions of your affinity group—the people in your workplace, culture, or alma mater—higher than those of others. Judgmental biases are those based on preconceived notions, including ideas about where people live, their organizational rank, or their circle of friends.

  When a cognitive bias overpowers your mental processing or perceptions, you do not recognize a novel situation as such. Emotion clouds your vision and you ignore new information as you handle the situation as though it were familiar. Instead, start by questioning your assumptions. Ask others the simple question, “Am I missing something here?” Take a moment to check yourself. Look for anomalies and teach others to do the same. We met a CEO who appointed someone to his crisis management team specifically to spot cognitive bias.

  When alarm or anger sets in, your field of vision narrows from seeing a spectrum of grays to only a stark, black-and-white set of options between good or bad, friend or enemy. Your suspicions about the unknown grow. Cognitive biases can deceive you.

  Cognitive biases put enormously powerful and sometimes overwhelming constraints on your thinking and your leadership. These narrow perspectives can inflate vulnerabilities. The more complex and alarming the problem, the more people tend toward simple and reassuring explanations. This can be true even when those perspectives defy logic and common sense.

  To address the phenomenon, modify how you deal with yourself as well as with others. If you remain aware of preconceived biases and stay vigilant in combating them, you have a better chance of distinguishing self-serving or convenient fictions from critical facts. And as you remain open to feedback from others, they are more likely to provide insights and advice on what you may be overlooking.

  The Meta-Leadership Challenge

  There are two sides to the meta-leadership challenge.

  First, there is you. Don’t be afraid to challenge the limitations that cloud your own view—your biases, experiences, and preferences. At times, leaders are so intensely focused on charging ahead, with everyone else in tow, that they fail to notice their own blinders. If considering other perspectives feels to you like a sign of weakness, you may be making yourself even more rigid and unaware. It is dangerous to be oblivious, and even more so to be unaware that you are oblivious. Avoid traps of your own making.

  Second, beyond your own limitations, marshal the patience, sensitivity, and tenacity to gently and persuasively understand the perspectives of others. This is no easy task. It will call upon all your finesse and capacity for empathy, diplomacy, and
flexibility. Remember, there are powerful reasons why people see the world the way they do. Expecting them to shift perspective, even slightly, is a tall order. Sometimes it is best to work with their cognitive biases rather than try to persuade them otherwise. In the earlier example of H1N1 policy development, Dr. Besser recognized that the presidential advisors, with their focus trained on political considerations, were afraid that shifting the narrative too quickly would make the administration appear erratic and fit the public’s confirmation bias about government incompetence. Holding fast to the science, Dr. Besser nevertheless accepted “decisions I could live with” regarding the timing of announcements and policy changes. The simple image of the Cone-in-the-Cube is a useful reminder and metaphor. It can help you and everyone else get beyond the intrapersonal and interpersonal constraints that prevent all of you from building a valuable meta-perspective on the situation at hand.

  June 8, 1999, Belgium. Thirty-three schoolchildren became ill after drinking Coca-Cola produced in Antwerp. Some were hospitalized and others reported similar symptoms a few days later. Then some eighty people in northern France reported intestinal problems after drinking Coke produced at a different plant, this one in Dunkirk, France. In total, more than 250 people were stricken. The media exploded, spreading fear of further contamination.

  The scare led to the largest product recall in the company’s history: 17 million cases of Coke across five countries. Belgium and France banned the sale of Coca-Cola products for ten days. Health ministers in Italy, Spain, and Switzerland warned their populations against consuming Coke, even though the product for sale in those countries was not produced in the suspect plants.

  At the time of the outbreak, Douglas Ivester, Coca-Cola’s CEO, was in Paris. The leadership actions he took over the next days and weeks are instructive: they illustrate common pitfalls that you may also confront when your next crisis strikes. Ivester had fashioned Coca-Cola into a highly centralized organization. All international group heads were based in Atlanta, so instead of heading to nearby Belgium upon learning the news, he returned immediately to headquarters in Atlanta. While the company sent several dozen executives to Brussels to manage the crisis, its official position was that Coca-Cola products posed no serious health risks. It took two days for the company to provide the necessary information and identify potentially tainted product. Ivester himself made no public statement for eight days. He finally traveled to Belgium ten days after the initial scare, the first of what would become several trips. Yet still he made no public appearances.

  Ivester had risen through the corporate ranks and should have been ready to handle such a crisis. He was an accomplished executive elevated to CEO by a unanimous vote of the board of directors, his promotion lauded in the press and welcomed by the markets. He understood the company, having been with Coca-Cola for twenty years before getting the top job. And he was fully familiar with European customers and governments. A decade earlier, his first operating role with the organization was as president of European operations.

  Under his direction, Coca-Cola took several proactive steps. The company set up a consumer hotline, offered to pay all medical bills related to the affected products, and launched investigations. They discovered a problem with defective carbon dioxide at the plant in Belgium. In France, they found that a wood preservative used on shipping pallets may have contaminated the outside of the cans. In the thinking of Coca-Cola officials, however, neither of these problems would have created a serious health risk. They went about making their case using evidence they found compelling. They publicized the results of tests and other health data that absolved their products. Eventually, the company and Ivester apologized, even though he continued to believe that his company bore only part of the blame. The rest, in his view, was not his problem.

  Despite the evidence presented by the company, Coca-Cola’s stock dropped by 10 percent. Sales fell worldwide, and their competitors gained ground. Half a year later, on December 5, 1999, Ivester resigned as CEO. This incident was but one of several missteps that led to his descent.

  Ivester’s error wasn’t lack of concern or unwillingness to take action. It was misreading the situation because he viewed it through a very narrow lens. In the end, seeing only a technical product quality problem, he failed to see the Cone-in-the-Cube. He took steps to diagnose and rectify that problem. When the tests showed no link between the product defects and the reported illnesses, he shared those results. He was firmly convinced that everyone else would concur when they saw the same evidence.

  What Ivester didn’t understand was that consumers and governments were going to have a different view of the problem than he did. They saw sick schoolchildren. They were worried, suspicious, and perplexed, and the problem they saw was as much emotional as technical. The public needed empathy more than evidence. In his mind, Ivester had solved the problem—though only as he understood it. He did not solve their problem—the public’s problem and the problem of consumers of his company’s products. He thereby fostered suspicion of his leadership and his product. He may have answered his own question, though he did not unravel the bigger conundrum. And that was the one that mattered.

  When you confront a major problem or crisis, consider how each stakeholder views the situation. Your task as meta-leader is to discover and respond to that bigger picture, fusing different perspectives into solutions that work across your range of critical constituencies.

  The Meta-Leadership Brain and Problem-Solving

  You may well take your brain for granted. Information and experience are inserted and stored in the gray matter. You learn tasks and processes and, with time and repetition, come to perform them reliably. You order information into logical and explainable patterns that elicit clarity and believe with confidence that you can anticipate what will happen next. You presume that you are in control of your brain and assume that others are in control of theirs. It all makes sense because that’s what your brain is supposed to do: make sense of things.

  It doesn’t always work that way.

  The brain is a mysterious organ. We actually know and consciously control much less of our brain than we realize. Chemicals and hormones regulate our emotions in ways we can barely sense. There is much that we cannot recognize or comprehend.

  Clinical studies have shown that the brain has a narrow focus: its most important function is keeping you alive, and second-most important is perpetuating your genes through your offspring. It perceives only what it thinks it needs to see lest it be overwhelmed by sensory overload. People assume that they know and understand more than they do. Even memories have been shown to be malleable, subject to change over time. False memories can become embedded as accurate recollections.

  A dramatic example comes from the Innocence Project at the Cardozo School of Law. They report that eyewitness misidentification played a role in more than 70 percent of convictions overturned through DNA testing. Misconceptions themselves limit what you can know and understand.

  Why is this so? You are an ambitious and accomplished individual. You consider yourself to be pretty smart. Your greatest and most valuable asset is your brain. Remember how you scored on that high school test? How you hammered that essay in college? Who is to tell you that you don’t know and control your own brain?

  Having an appreciation for what you do not know and understand is actually an important step in building broader knowledge and deeper understanding. It is the motivation for curiosity and the springboard for imagination. You become open to thoughts you might otherwise miss. You seek evidence, data, and facts from sources you know to be reliable even if they run contrary to your assumptions and views.

  Why can you see more by appreciating what you are not seeing? Because in doing so, you transcend the misconception that what you see is all there is to be seen. You abandon the notion that you and you alone “get it.” With a deeper appreciation for your limitations, things previously unseen become more visible. You begin to pay greater attention to others who may alert you to what you have been missing. You remove your own blinders.

  Think about this contradiction, as strange as it might be: the power of your brain increases the more you acknowledge its limitations. This is a vital insight for your budding meta-leadership.

  The understanding of how the brain works is still relatively new. It is known that the brain grows in both structure and function as a result of experience. It is not a rigid organ. The brain changes over time through a phenomenon called neuroplasticity, which accounts for the ability of some stroke victims to regain speech even though the part of the brain that typically controls this function has been damaged. The brain adapts to better accomplish what you ask it to do. The more you “train your brain” to think about multidimensional, Cone-in-the-Cube problems, the more readily it will discern them going forward. The more you practice the meta-leadership mind-set, the more readily you will be able to draw upon it.

  The Cone-in-the-Cube is a tool to encourage this wider thinking about problem assessment and solution-building. It encourages robust analysis through discovery of multiple perspectives. In its most basic form, the Cone-in-the-Cube helps you compare and contrast linear thinking and complex adaptive systems thinking, as discussed in the prior chapter.

  At times, you see a simple linear problem best addressed by a simple linear solution. We call this a “duct tape problem.” Apply the fix and move on. Don’t complicate matters by making them more difficult than necessary.

  At other times, the wide meta-view will help you discern the complexity of what you face. Embedding the Cone-in-the-Cube into your thinking trains your brain to understand the complexity and to overcome the bias that inclines it toward linear thinking. You will see beyond the parts to grasp the whole. Once you see the larger, integrated picture, your job is to communicate it so that others can appreciate it as well. This is meta-leading, complex problem-solving.

 
