Influence in Action

by Craig Weber


  What About Gut Feelings, Hunches, or Intuition?

  People often say: “You should keep your emotions out of the conversation,” or, “Before the meeting begins, remember to leave your emotions at the door.” This is bad advice for two reasons:

  • First, it’s impossible to do; you can’t just turn your emotions on and off at will.

  • Second, it’s counterproductive. Research shows that emotions are essential when it comes to making decisions. Without them you have no preferences.4

  Research also suggests that you should particularly pay attention to your gut feelings when you have a lot of experience with the subject matter. They’re often the product of tacit knowledge.5

  So here is more practical advice: Rather than attempt to keep your emotions out of a conversation, work instead to prevent your emotions from running away with the conversation. Don’t negate your feelings; treat them responsibly. How do you do this? Rather than suppress your feelings, intuitions, or hunches, treat them, instead, like hypotheses by explaining and testing them:

  • My gut is telling me this is a really bad decision, but I can’t quite find an explanation for why I feel this way. Is anyone else feeling the same way, and, if so, are you any closer to the underlying reasons than I am?

  • I really love this idea, but I’m not sure if that’s because it fits with what I’ve done before or if it’s genuinely the best way to go. Where are others on this decision, and what are your thoughts? I’d be particularly interested in hearing from someone who has reservations about it.

  • This decision scares me. Is anyone else feeling this way, or am I just overreacting?

  Often others will see this as an opportunity to bring up their own concerns or fears. If no one else is feeling the same way, back off and see if the feeling goes away, or investigate further to see if you can discover some of its underlying causes.

  It’ll Make You Smarter

  This practice will force you to get more rigorous about how you perceive “reality” and how you explain your perceptions to others. Knowing you’re going to show people your road map forces you to be more disciplined about how you share your ideas. You’re less likely to make a flippant comment, toss a half-baked idea on the table, or showboat your opinions, after all, when you know you’ll need to disclose the thinking behind them.

  Knowing you must back up a claim by explaining your thinking also imposes a degree of discipline on how you formulate your views. It forces you to evaluate your mental model, to think it through with genuine curiosity, and to ask yourself these critical questions:

  • Why do I think what I think?

  • How valid is my perspective? On what is it based?

  • Are there gaps or blind spots in my way of looking at the situation?

  • Where am I wrong about this?

  • Do other people see it differently, and, if so, how have they gone up the ladder?

  This self-reflection encourages you to hold your views like hypotheses rather than truths, a bedrock trait of someone with high conversational capacity. Full of healthy doubt and skepticism, you’re more curious than dogmatic, more inquisitive than rigid, and more open-minded than opinionated. You’re better equipped to marshal your intelligence in the service of your goals, and to detect and correct errors and gaps in your thinking, which makes you, quite simply, smarter.

  Many highly intelligent people are poor thinkers. Many people of average intelligence are skilled thinkers. The power of a car is separate from the way the car is driven.

  —EDWARD DE BONO

  You’ll Have More Influence

  Evidence-based thinking is more influential. If the architect, for example, just said, “It’s beautiful,” without explaining why in a rigorous way, it’s unlikely he’d affect the police officer’s point of view. But if he claimed it was beautiful, and then explained what he witnessed and how that led him to his position, he’d be far more likely to influence the cop’s perception of the city. In a similar way, if Steve had failed to explain the underlying thinking that informed his position, Phil would have been far less likely to give any credence to Steve’s concerns.

  You’ll Be Less Manipulatable

  Learning to craft and explain your own thinking in a more rigorous way also makes you less prone to the manipulative nonsense of others. You aren’t swayed by a claim just because of the person’s authority, good looks, charm, or wit. When someone makes a claim, you paraphrase the classic line of Cuba Gooding Jr.’s character in Jerry Maguire: “Show me the evidence.” This thorough thinking, described best, perhaps, in Carl Sagan’s “baloney detection kit,”6 means you are far less likely to blindly accept the views of others precisely because you are being so disciplined about managing your own.

  A Parting Note on the Candor Skills

  If you’re wondering why I’m pounding this issue of rigorous, evidence-based thinking so hard, consider this observation by Carl Sagan:

  In hunter-gatherer, pre-agricultural times, the human life expectancy was about 20 to 30 years. That’s also what it was in Western Europe in Late Roman and in Medieval times. It didn’t rise to 40 years until around the year 1870. It reached 50 in 1915, 60 in 1930, 70 in 1955, and is today approaching 80.7

  What do we have to thank for these impressive gains? It isn’t opinion-based decision-making, and it isn’t a greater use of Ouija boards, Magic 8 Balls, or the Psychic Friends Network. No, it’s the greater use of more rigorous, evidence-based reasoning in the fields of public health and medicine. We’re making more effective decisions about how to improve our health and extend our lives because our approach to the challenge is now largely grounded in an evidence-based pursuit of informed treatment.

  So let me wrap up by reemphasizing a point I made at the beginning of this chapter: There’s a spectrum of quality when it comes to thinking. At one end you have weak, opinion-based reasoning; at the other, more robust, evidence-based reasoning. The more you’re focused on working with others to make the most informed and effective decisions possible, the further toward the evidence-based end of that spectrum you should move.

  From here on out, I expect you to pay attention to how candid you’re being in the moment. Do you have ideas or concerns that you’re not putting forward? How might you get your thoughts out of your head, through your mouth, and into the conversation in a productive, learning-oriented way? How clearly are you stating your position? How effectively are you articulating your thinking? Reflect on meetings. Use drive time, for instance, to consider questions such as these:

  • How candid was I in the meeting today?

  • Were there things I should have said but held back? Why?

  • What triggered me to withhold my view?

  • How can I become more aware of this next time it happens? What was I feeling?

  • What were the signs I was minimizing?

  • What is a more productive way to handle it the next time I notice this happening?

  • How could I have said what I needed to say in a constructive, learning-focused manner?

  Your thinking is a tool you use to make sense and to make choices. So treat it—and share it—with discipline and respect.

  CURIOSITY SKILL #1

  Testing Your Hypothesis

  People who claim to be absolutely convinced that their stand is the only right one are dangerous. Such conviction is the essence not only of dogmatism, but of its more destructive cousin, fanaticism. It blocks off the user from learning new truth, and it is a dead giveaway of unconscious doubt.

  —ROLLO MAY

  You’re learning how to be candid in a more rigorous and responsible way. That’s important. But stop there and you’ll end up, as the Flamethrower put it, “All push. No pull.”* The key to staying in the sweet spot is to now employ your curiosity.

  Curiosity, remember, is an indispensable part of the “production” process in your mental workshop. It’s the reflection of a mindset that places learning over ego and the bettering of your mental models over being comfortable or feeling right. It’s what distinguishes open-minded people who manage their worldview from closed-minded people who are trapped in it. When properly focused, curiosity leads to sharper thinking and smarter choices because it helps you catch and correct the inevitable disconnects between the mental maps you’re using to make sense of a situation or issue and the actual events on the ground. By the time you finish this chapter and the next, you’ll better understand the importance of cultivating your curiosity, and more important, the skills you need to strengthen to put that curiosity into action.

  But curious about what? To answer that, I invite you to revisit three questions I first shared in Chapter 9:

  1. What am I seeing that others are missing?

  2. What are others seeing that I’m missing?

  3. What are we all missing?

  With these questions in mind, the next step is to seek answers. As you’ll no doubt recall, there are two skills that help you counter your rigorous “push” with some genuine “pull”:

  • Testing your own hypotheses.

  • Inquiring into the hypotheses of others.

  Testing

  Between 560 and 547 BCE, Croesus ruled as the king of Lydia, a country in what today is Turkey. Concerned about the growing power of the Persians, he thought it best to attack them before they grew any stronger. Before making his decision, however, he consulted the oracle at Delphi, who told him, “If Croesus goes to war, he will destroy a great empire.”

  Emboldened by the oracle’s response, Croesus made the necessary preparations and attacked the Persians. The Persians quickly overpowered the Lydians, captured their city, put Croesus in chains, and ordered him to be burned alive.

  The oracle had spoken correctly: Croesus had destroyed a great empire—his own. By failing to consider other, less favorable interpretations of the oracle’s portent, he based a major decision on his flawed, self-serving interpretation. By construing what he was hearing to fit his expectations, Croesus suffered a major disconnect between the mental map he was using to make the decision to go to war and actual events on the ground. Croesus experienced, you could say, a disastrous “Indianapolis moment.”*

  Holding Your Perspectives with a Bias for Learning

  You’ve shared your view and explained it clearly. That’s a good start. The goal now isn’t to sell it, convince others you’re right, or cajole them into seeing things your way—it’s to learn. This is where a curious mind is invaluable. With a mind focused on learning, you’re now asking yourself these important questions:

  • What are others seeing that I’m missing?

  • Do I have a blind spot?

  • Am I misinterpreting the oracle?

  How do you discover, for example, where your thinking is off-kilter? How do you know where your mental model needs some work? How do you detect and correct any mental errors in your analysis? How do you recognize where your perceptions of “reality” are being distorted by a cognitive bias? How do you know if you’re missing some evidence, or making—like Croesus—erroneous assumptions? How do you know if you’re having an “Indianapolis moment”?

  It’s simple. You test your thinking.

  Our decisions, opinions, perception, and memory can all be set adrift by our emotional undercurrents, often without our even noticing that our anchor has slipped.

  —CORDELIA FINE

  With an effective test, you’re treating your view like a hypothesis rather than like a truth; a premise rather than a fact; a provisional point of view rather than a rock-solid veracity. How do you do this in a conversation? Like a scientist publishing his or her research in a peer-reviewed journal, you subject your point of view to scrutiny.

  This is common practice in a range of disciplines. For example, a doctor rarely looks at a patient and immediately begins treatment. She first works to properly diagnose the patient’s condition. She may have an idea about what’s ailing her patient, but she tests her assumptions to ensure she’s looking at the problem correctly. Only then does she take action. She approaches the problem-solving process in this manner because her goal is to make the most informed and effective choices about how to treat her patient.

  In your mental workshop, you’re doing the same thing. Your goal is to expand and improve your thinking by subjecting it to scrutiny. You do this by showing people that you’re holding your view like a hypothesis you want to evaluate rather than a truth you’re trying to protect or to sell. This is unlikely to make you feel comfortable or “right,” but if you want to sharpen your thinking, it’s the only way to go.

  The point of testing your hypothesis is to proactively detect and correct errors in how you’re making sense of things. You’re trying to spark double-loop aha moments before you make important choices. You don’t want to wait until you’re lost and confused before you realize that the view of reality you’re using to reach your destination is flawed.

  Is It Really That Necessary?

  “Testing sounds like a royal pain in the ass,” you might be thinking. “Is it really that important?” It is. Test, originally a word from Latin describing a small container used to evaluate gold and other precious metals, had by the Late Middle Ages developed the meaning familiar to us today: a “trial or examination to determine the correctness of something.”1 As I’m describing it here, testing is straightforward in concept: You’re conducting a trial or examination to determine the correctness of your thinking.

  It may be simple in concept, but it’s challenging in practice. The difficulty stems from how your mind works. Your brain hands you a view of “reality” that, despite being riddled with errors, distortions, and blind spots, seems rock solid (which explains why “Indianapolis moments” are so commonplace). The problem is that your brain is so adept at tricking you in this way, at making you feel you can trust it, that testing your view doesn’t feel necessary. Learning to do it wholeheartedly, therefore, is no easy task.

  But while difficult, it’s vital to your goal of thinking clearly and making smart choices. A disturbing number of cognitive biases distort how the human mind makes sense of the world, from the hindsight bias and the bandwagon effect to the Dunning–Kruger effect and the fundamental attribution error. The most pernicious of them all, the grand distorter, is the confirmation bias: once you’ve adopted a view, your brain seeks out information that supports it and misses or dismisses information that contradicts it. Thanks to your confirmation bias, you have a penchant for getting stuck in your narrow way of seeing things.

  Part of the problem is that cognitive errors are seen as bad; they’re viewed as a sign that you’re not as smart as you should be. Ironically, that’s the wrong way to frame the issue. “Of all the things we are wrong about, this idea of error might well top the list,” says Kathryn Schulz, author of Being Wrong. “It is our meta-mistake: we are wrong about what it means to be wrong. Far from being a moral flaw, it is inextricable from some of our most humane and honorable qualities: empathy, optimism, imagination, conviction, and courage. And far from being a mark of indifference and intolerance, wrongness is a vital part of how we learn and change. Thanks to error, we can revise our understanding of ourselves and amend our ideas of the world.”2

  Do not feel absolutely certain of anything.

  —BERTRAND RUSSELL

  With that in mind, you adopt an anti-confirmation bias and test the views your brain hands you. You treat with great suspicion the view of “reality” your mind is serving up because you’re more interested in making informed and effective choices than in protecting a flawed viewpoint. This opens the path to smarter thinking because you’re actively struggling against your self-serving brain by leaning into difference, seeking contrarian perspectives, and asking people to challenge how you’re currently making sense of things. And if additional evidence or more cogent reasoning can improve your mental map, you change it. (Remember, with your new mindset, getting smarter takes priority over stroking your ego. If you’re not going to test your views, stop wasting other people’s time and talk to yourself in a mirror instead.) The basic idea is that when you’re focused on clearer, cleaner, more rigorous thinking, you’re working extremely hard not to buy into your brain’s own bullshit.3

  You’re Candid Because You’re Curious

  This is a big point. Remember, one reason you’re being so rigorously forthright with the candor skills is so you can test your view of reality. And there is no way to test your view if it’s not clearly explained and open to scrutiny. In this way, candor and curiosity are intimately related. Focus on pooling perspectives to expand and improve your thinking in pursuit of making the smartest choices possible. When you do so you’re not just candid and curious, you’re candid because you’re curious.

  You’re not just candid and curious, you’re candid because you’re curious.

  What the Skill Is

  Trying to improve your thinking by thinking about your thinking can trap you in a cognitive cul-de-sac, so to be a more flexible and adaptive thinker, you need to contrast your thinking with outside perspectives. Testing, therefore, is a form of mental discipline by which you refuse to blindly accept the view of reality presented to you by your brain. Because you’re holding your view like an educated guesstimate rather than an absolute certainty, you’re less attached to your take on things and more open to alternative views. Testing is the triumph of learning and humility over ignorance and arrogance.

 