by Craig Weber
Steve’s conversation provides yet another good example: “Phil, it seems a few people are just a little nervous about talking with you because, well, you can be—and this is not a fault—really passionate sometimes.” This watered-down position doesn’t serve Steve’s goal of helping Phil identify a potential gap between what he wants (people telling it to him like it is) and what is happening (people not telling it to him like it is).
But Steve could have undermined the purpose of the conversation if he had erred in the other direction and expressed an unnecessarily harsh position: “Phil, you act like a Philistine by verbally abusing people in nearly biblical ways and yet expect them to ‘tell it to you like it is.’ I can only think of three reasons you’d do that—ignorance, insincerity, or foolishness. Which is it?”
By inviting both misunderstanding and defensiveness, both approaches subvert the purpose of the feedback: to make things better. Steve finally settled on a more balanced approach that was direct and no-nonsense, but without any unnecessary harshness: “Phil, I’ve never worked for someone who’s more open about his need for timely and accurate information than you are, and I applaud that. But despite your good intentions I think you act in ways that make it really difficult for people to do what you’re asking.”
Clarifying Points
It Can Be a Feeling
Stating how you’re feeling is often a great way to frame your point. A clear position can declare confusion, uncertainty, or ambivalence:
• This decision makes me very nervous.
• I feel conflicted about the best way to go on this change process.
• The thought of making this change scares me.
• I love this idea for two big reasons.
• I feel torn between these two options.
It Can Be Tentative
You can lower defensiveness by signaling, in the very way you put your position forward, that you’re not wedded to it and are holding it hypothetically:
• Here’s my “going in position” on the issue . . .
• Right now, I’m looking at the situation like this . . .
• My current take on this problem is . . .
It’s More Authentic
Speaking in such crisp, no-nonsense terms is more authentic, which generates more trust and respect.
It’s Easier in Some Circumstances, Harder in Others
This skill is harder for some people than it is for others. It is also harder in some circumstances than it is in others. The Flamethrower, for example, was a natural with this skill.* When his internal reaction to a colleague’s suggestion was, “That won’t work,” he’d simply say, “That won’t work.” That’s 10 out of 10 on the position clarity scale.
But if you’re like me and struggle with a strong tendency to minimize, this skill takes more practice to master. I tend to sacrifice a clear position to avoid being confrontational, rocking the boat, looking like a nonteam player, hurting feelings, or making myself look foolish. So, putting forward an effective position is an unnatural act, and I’ve worked hard to strengthen my ability to do it well. I realized that if I didn’t learn to state my views directly, I’d risk the three problems I mentioned earlier: I’d be more likely to be misunderstood; to have less influence; and to appear ill-informed, weak-minded, or two-faced.
In situations where your need to minimize is being triggered, a well-crafted position is often the first casualty. But it’s exactly what’s necessary if you’re to stay in the sweet spot.
* Don’t remember the Flamethrower? Revisit pages 46–47 and 104–105 in Conversational Capacity.
CANDOR SKILL #2
Helping Others See Your Thinking
Simple can be harder than complex: You have to work hard to get your thinking clean to make it simple. But it’s worth it in the end because once you get there, you can move mountains.
—STEVE JOBS
To get a thought out of your head and into the heads of others, putting forward your position is a start, but if your goal is to pool perspectives in the pursuit of learning, this skill, by itself, is inadequate. Like an elementary school child doing long division homework, you need to show your work. You must explain how you arrived at your position. You need to share your thinking.
A useful analogy is mapmaking. When you’re helping other people see your point of view, you’re doing your best to present the mental map—the cognitive cartography—you used to arrive at your position. Your position tells others where you stand on an issue. Your thinking shows how you got there.
Your Thinking Has Two Facets
To describe your mental map in an accessible way you do your best to share the evidence you’re using and how you’re interpreting that evidence. Your view on any issue is always a mix of these two components, and if you’re going to lay your thinking on the table so others can examine it, you need to share both.
When Steve chose to raise his concern with Phil, for instance, he prepared for his conversation by exploring important questions with his colleagues: What is leading us to our concerns? What are we seeing or hearing? The answers to these questions allowed Steve to share specific, concrete examples of what he was seeing and hearing in the workplace: the hallway conversations, examples of Phil’s reactions in meetings, the coaching he provided to the shell-shocked project manager, as well as the new nickname people had for their interactions with Phil—“The Gestapo Interrogation.” He went on to add another compelling piece of evidence: “It’s so bad that I was actively counseled by a lot of my colleagues not to even come in here and bring this up with you. One even suggested it would be ‘career suicide.’”
But Steve also prepared for his conversation by thinking through his interpretation of this evidence, and how he and others were making sense of it. “I’m concerned that if this continues people will be increasingly afraid to bring up important information and the impact on the business will be dire.” This is not evidence—it’s how Steve is making sense of the evidence—but it’s a pivotal part of his thinking.
Explaining Your “Ladder”
For another example, reflect back to the story of “the cop and the architect” in Chapter 6 of Conversational Capacity. As the cop and the architect walked around the city of Chicago together, they each ascended the Ladder of Inference in very different ways. When asked to describe the city one stated, “It’s a dump,” while the other declared, “It’s beautiful!” They each experienced the city differently because they each focused on different aspects of the environment, which led them to interpret the city in contrasting ways. As I wrote in Conversational Capacity:
Whenever we talk with other human beings, to one degree or another, we’re just like the cop and the architect. We all filter the world around us in distinct, biased, and incomplete ways. If our conversational capacity is high, we know that we each have a unique take on “reality,” and we want to get our “ladder” into the conversation as clearly and candidly as possible.1
The process for doing this is fairly simple. Put out a clear position, and then explain the thinking behind it. By explaining your thinking to others, you’re describing how you’ve gone up your ladder.
Evidence
When it comes to your cognitive cartography, not all mental maps are created equal. There are two basic paths to a point of view:2
1. Biased, myopic, opinion-based thinking.
2. Rigorous, expansive, evidence-based thinking.
If your goal is to double down on ignorance, cling to your current perspective, and make half-baked choices, the first kind of thinking is optimal. If, on the other hand, your goal is to learn, to create more accurate mental maps, and make more intelligent choices, the second kind of thinking is the only way to go. It’s that simple. Dull thinking or sharp thinking. Those are your options.
Some people prefer the first kind of thinking. You probably know a person who wields fervent opinions unencumbered by evidence. They accept ideas or views with little data, especially if those views support their current thinking. Such people live in what one engineering team I worked with referred to as the DFZ, “the data-free zone.” Research shows such people are more inclined to believe in conspiracy theories and pseudo-profound bullshit. At its most softheaded extreme, these sloppy “thinkers” spend a lot of money on stuff sold on late-night TV infomercials and tend to believe everything they read on the Internet. “Dude, I just joined the Flat Earth Society!” or, “Did you hear that NASA found life on Uranus but they’re covering it up? No. Seriously. My friend saw the photos online.”
Most of us fall somewhere in the middle, neither perfectly rational and evidence-based thinkers, nor mouth-breathing morons plagued by an allergic reaction to facts. When your aim is rigorous thinking and smart choices, however, you focus your beam of attention on evidence. This imposes a degree of discipline on how you look at the world by forcing you to ask important questions:
• Why do I think what I think?
• How valid is my perspective?
• Do other people see it differently? If so, what evidence do they have? How might they be interpreting the evidence in ways I’m not?
Questions like these encourage you to hold your views like hypotheses to be tested rather than truths to be sold. It is this dedicated focus on evidence-based thinking that separates smart, discerning, rational people from dense, bias-driven dullards.
I’m coming on strong here, but it’s hard to overemphasize the point. With high conversational capacity, you’re not engaged in the childish sport of “Whose opinion is the sexiest?” or “What’s the most convenient way to think about this issue?” and you’re not playing the kiss-ass game of “Let me find out what the boss or the group thinks so I can espouse an agreeable opinion.” No. Dedicated to learning and aware of your cognitive limitations, you ground your thinking on evidence rather than convenience, comfort, authority, bias, eloquence, flashy presentation, or social standing.
Basic Kinds of Evidence
There are different kinds of evidence you can provide to help people see how you’ve reasoned to your position:
• Directly observable (sensorial) evidence. As you walk up a city street, for example, you might see people sipping coffee at a sidewalk cafe, smell the tacos being sold by a food truck, feel the warmth of the sun on your face, and hear the siren of a passing ambulance. In a meeting, you might focus your attention on the words on the agenda, the comments of your manager, the body language of your colleagues, the sound of the gardener’s leaf blower outside, or the smell of coffee brewing in the break room.
• Validated evidence. This is evidence backed up by scientific methods. For example, to support your view, you might provide a research paper from a peer-reviewed journal.
• Statistically or mathematically measurable evidence. To help others see how you arrived at your view, you might share a financial spreadsheet, a profit-and-loss statement, or an engineering schematic.
• Analogic information. You might compare similar things: “This isn’t that different from our acquisition in 2008 . . .” or “Here are examples of two other companies, similar to ours, that have tried to make this same change and failed.”
• Systemic analysis. By providing an organizing framework for evidence, systems thinking tools—such as behavior-over-time graphs, causal-loop maps, or stock-and-flow diagrams—help you describe the underlying structure of an issue or problem (what systems thinking expert Chris Soderquist calls the “physics of a system”) in a rigorous, evidence-based way.
• Anecdotal evidence. Less rigorous and more prone to error, anecdotal evidence might include your previous experience with a comparable decision, or conversations you had with someone who had a similar experience.
Why Your Thinking Matters
Rigorous, evidence-based reasoning is pivotal to informed decision-making. It’s the only way to prevent sexy bad ideas from trumping boring good ones. Remember, your goal is not to bounce your opinions back and forth in a game of poppycock Ping-Pong. It’s to pool, evaluate, and improve the thinking being used to make sense of an issue, to solve a problem, or to make a decision.
Interpretation
Imagine an engineering team has developed a new idea for their product that they believe will provide a major boost to the business. Excited about their work, they pitch it to the management team. After explaining their idea in technical detail to the team they switch off the presentation and say, “So what do you think?”
The sales director jumps in immediately: “This is fantastic! We’ll make a fortune! We’re obviously going to do this. How soon can we start?”
The corporate attorney, having heard the same presentation, responds to the sales director with a simple question: “Do you like prison food?”
These contrasting reactions aren’t due to evidence—the sales director and the corporate attorney each witnessed the same presentation. Their contrasting reactions are due to their interpretations of the evidence. They’re using different intellectual filters to process the information, and this leads them to divergent conclusions. When interpreted through the sales lens, the evidence presented by the engineers leads to a strong position: “This idea is fantastic.” But when interpreted through a legal lens, the same evidence leads to a conflicting position: “This idea is a felony.” Their differing areas of expertise lead them to screen and weigh the evidence in very different ways.
The main point is this: Very often it’s not the evidence you’re paying attention to that makes your perspective unique; it’s how you’re making sense of it. So, when you’re in your mental workshop laboring to produce smart choices, you need to consider both the evidence and how you’re filtering it. Both are essential if you’re to get your entire Ladder of Inference into the open so others can respond to it. To do this you’re asking yourself important questions:
• How have I gone up the ladder on this issue?
• What is the evidence I’m focusing on?
• What are my gut reactions to it?
• Where are these reactions coming from?
• What are the logical assumptions I’m making?
• What do I think the evidence is telling us?
Ladder of Inference
The version of the Ladder of Inference I shared in my first book, Conversational Capacity, provides a useful way to think about how the evidence you pay attention to, and how you interpret that evidence, leads to your position on an issue.
Here again, Steve provides a good example. After sharing with Phil what he was seeing and hearing around the company (evidence), he explained to Phil how he was making sense of it all, and what concerned him about the evidence he was sharing (interpretation): “My biggest concern is that if this continues you’ll get less and less information about what’s really going on in the business and the consequences could be severe.” By providing both the evidence and how he’s making sense of it, Steve shared how he was “going up the ladder” with Phil.
Again, illustrating your thinking by clearly describing both your evidence and how you’re interpreting it fosters more learning on two fronts:
1. It makes you smarter by helping you improve your thinking. There is no way to effectively test your view if you don’t put it forward clearly. You’re candid, in other words, because you’re curious.
2. It makes other people smarter. You’re making sense of an issue in a unique way and your views can help other people detect and correct blind spots, biases, and errors in how they’re making sense of the situation.
A Few Clarifying Points
Don’t Overdo It
Francis Flaherty shares this gem in his book, Elements of Story:
Don’t ask Barbara how her morning went. She’s the type who will give you a full accounting—everything about breakfast (bran muffin, orange-peach-mango juice, used up last coffee filter), everything about the trip to work (left 10 minutes late, made up the time on the Taconic State Parkway), and about everything else.
Barbara is not wandering from the question. She just thinks that her listener has a boundless appetite for facts and a Texas-size capacity to absorb them.
Flaherty then provides sharp advice that applies equally well to how you put forward your thinking: “Don’t be like Barbara.”3
It’s better to explain your ladder in bite-sized chunks than to share too much information and risk losing your audience because they can’t keep up with your train of reasoning. You don’t want people to check out or cut you off by saying, in essence: “I’m sorry to interrupt, but does your train of thought have a caboose?”
Over-explaining is often a sign you’re trying to sell your view rather than describe it. It can also come across as haughty and condescending, as if you’re underestimating the intelligence of your audience. So don’t overload your listeners. The additional gem of information you think you’re providing may instead be the straw that breaks the camel’s back.
[Thinking] improves in direct ratio to the number of things we can keep out of it that shouldn’t be there.
—WILLIAM ZINSSER
Instead of sharing all you know, lay out just enough to illustrate your key point so you can test it. If your colleagues want to know more about your perspective, they can always ask for more information. (In fact, you’ll be testing your view shortly, which provides that very opportunity.) Remember, the conversation serves a purpose—improving your mental maps of reality so you can make better choices. So share only enough information to serve this purpose and no more. This can be accomplished verbally, with reports, or with visuals such as systems-thinking tools like causal-loop maps, trend-over-time graphs, or stock-and-flow diagrams. Whenever you’re trying to share your view with others, the important question to ask yourself is: “What’s the most efficient way to clearly share my thinking about this issue?”