The McKinsey Engagement
Complicating matters, the buckets we settled on were not necessarily equal in terms of workload. Alan observed:
Halfway through the semester, we realized that some people were going to have much more work than others, determined by the bucket they were leading. For example, with police and fire, we were constantly reevaluating our approach to analysis as well as appropriate metrics (e.g., crime prevention vs. punishment). All of this reworking ended up being very, very time consuming.
Because we were finally happy with our buckets, because we wanted to preserve simplicity, and because we needed to keep responsibilities clearly defined, we decided not to rearrange our buckets or to have team members jump onto other members' topics. Instead, we tried to fix the disparity in required time commitments by allocating nonbucket responsibilities in a way that would make overall workloads more balanced.
As I mentioned, the second issue that made framing the problem difficult was being unsure of what the community wanted. Because of the nature of our project (presenting to a large group of extremely emotionally invested stakeholders in a town meeting), taking the audience into consideration was necessary. Under the guidance of Dr. Friga, we tried to structure our research, our deliverables, and especially our presentation to address the needs of the audience. Alan shared his thoughts about audience awareness:
While framing, it was important that we consider who the audience was and what it wanted to hear; the members of our audience wanted to know what their situation was, what their options were, and what they should do. As we expected, the audience reacted negatively to the inevitable consequence of our recommendation to incorporate—increased taxes.
Because the audience members were so personally affected by the situations outlined in our project—and would be drastically affected by our recommendation, if it were enacted—we had to be very careful in our wording. For some issues (such as improved roads), we presented as if ours was a receptive audience, whereas for others (like increased taxes), our presentation techniques were adapted to fit the demands of a hostile audience.
Interestingly, we ultimately disproved our original hypothesis. At the beginning of the project, we thought that the area of Center Grove should be annexed by Bargersville. Because we subscribed to a hypothesis-driven approach, we started collecting data in an attempt to support our original proposition. However, we soon realized that the data (or at least our assumptions about potential citizens' reactions to such a merger) actually disproved our hypothesis, and we concluded that Center Grove should incorporate as a new city (this would deal with annexation resistance).
WHAT I LEARNED
The most important time we spent throughout the project was the initial period of brainstorming, framing the problem, and debating potential MECE buckets. This took longer than we expected, but it was well worth it in the end. Knowing exactly what we were looking for while researching and understanding where the project was going as a whole helped us to home in on the important issues and to avoid extraneous, unproductive work.
Although persistence and dedication are great qualities, it is important that you be flexible enough to change your mind when research does not support your initial hypothesis. I was impressed with this team's ability to consider data objectively and to amend our research and recommendations accordingly. A key reason we were able to do this is that we kept our eyes open throughout the project—defining our framing and bucketing approach was really an ongoing process, and we were constantly reexamining our underlying assumptions.
DELIVERABLES
Figure 5-4 Frame: Information Tree
Figure 5-5 Frame: Decision Tree
Figure 5-6 Frame: Articulate the Hypothesis
6
ORGANIZE
Figure 6-1 TEAM FOCUS Model—Organize
CONCEPT
Once the problem is framed properly, we need to organize our analysis efforts in a very strategic manner. Remember that our primary goal in this process is both increased effectiveness (doing the right thing) and increased efficiency (doing it well). The underlying assumption here is that many team problem-solving adventures could be improved. What are the typical issues associated with nonstrategic approaches to team problem solving?
I have generally found that there are three primary issues with the Organize bucket of analysis. All of these issues stem from problems that probably arise because of poor framing (see the system dynamics discussion in the previous chapter). The first issue occurs when a team organizes around the wrong things. Essentially, the issue tree either is not done in an MECE way or, more commonly, is not properly prioritized. The next issue is related, and it has to do with the allocation of resources in the team problem-solving process. The easiest (but not the most efficient) way to assign people to tasks is just to split things up evenly, without giving much thought to the workload and/or the impact of each area on the eventual end product.
The final mistake that teams typically make in the Organize phase is failing to design a work plan that is centered on testing the hypotheses developed during the Frame phase. The scientific method requires the proving or disproving of hypotheses, which are essentially potential answers to the key question. Spending a lot of time and energy gathering data on a more general basis (i.e., gather as much data as possible and we will see what we find—inductively) may be helpful in some circumstances (e.g., when you have absolutely no previous experience in this area), but it generally leads to gathering and analyzing irrelevant data, which is inefficient.
RULES OF ENGAGEMENT
Every team organizes in some fashion or other. The key differentiator is how strategically the team is organized. I have been part of many teams that have spent very little time thinking about the work streams because they wanted to get to what they saw as the real value-adding work quickly—data collection. The truth is that the organization is the value-adding work, and top consulting firms like McKinsey give careful thought to the problem-solving approach. Many consultants refer to this process as developing a "work plan," and I break that "work plan" concept down into two separate components, as described in the first two Rules of Engagement (process map and content map). The final rule pertains to the all-important "story line" that serves to guide the entire process.
RULE 1: DEVELOP A HIGH-LEVEL PROCESS MAP
The first step in organization is to create a process map (this is my terminology, and it is my understanding that consultants at McKinsey may not use this exact nomenclature; still, they always create a process map, whether they call it that or not). The process map does not need to be complex or extremely specific. In fact, I was a bit surprised by the lack of formality of the initial process maps at McKinsey. While working in the turnaround and bankruptcy practice at PricewaterhouseCoopers, we created meticulous process maps, partially because we were part of an accounting firm and partially because we had to report our time charges in six-minute increments. The truth of the matter is, the process map for a typical consulting project should be straightforward and should answer only a few key questions:
What needs to be done at a high level?
Who will do what?
What will the end result look like?
When will it be done?
Of course, during a project, there will be some adjustments made in the schedule, and the process map may need some updating. If the process map is kept simple, such updates will not require an onerous effort. Of course, if you are working on a longer project or one that involves more team members, the complexity of the process map increases. However, my tip is not to put too much specificity in the process map or to start combining it with the content map, which is described next.
RULE 2: CREATE A CONTENT MAP TO TEST HYPOTHESES
The content map is the element of the organization process that can have the most impact on the efficiency of a project. It is here that the team determines its analytical priorities and its approach for testing hypotheses. Again, while McKinsey consultants will not necessarily use the term content map, I find it to be helpful when teaching the organization component to other consultants, executives, and students.
The content map is very engagement-specific. It is an outgrowth of the framing process discussed in the previous chapter. Once the team determines the key question, outlines the issue trees (information and decision), and creates one or more hypotheses, the testing must begin. An efficient team will focus its energy on identifying and testing the most important subhypotheses that must be true if the overall hypothesis is to be true. You can test any number of statements that may seem related to the hypothesis, but what must you really know to determine if the hypothesis is valid?
The challenge, of course, is related to the determination of those supporting thoughts or subhypotheses. For Z to be true, X and Y must also be true, and how are we going to prove the truth of X and Y? Because this process is so context-specific, it is impossible to tell you, a priori, what the subhypotheses should be. When I teach this material, I offer the following advice:
Use frameworks from business school and from textbooks to generate ideas.
Examine past projects that had some similarity to this project (e.g., from the same industry, the same function, the same business issue, and so on) to see what has been used in the past.
Create a diverse team to participate actively in the brainstorming process (see Chapters 1 through 4 for more advice related to this process).
RULE 3: DESIGN THE STORY LINE
The final Rule of Engagement in this chapter relates to a critical organizing element for consulting projects: the story line. I cannot tell you how many times I heard the question, "What's the story?" in the halls of McKinsey. The story line is essentially the outline for the final presentation at the end of the project. This is one of the secrets of efficient problem solving: you begin working on the final presentation story very early in the project—almost on day one. Right after the framing is finished and before systematic data gathering commences, the team should develop an initial story, brainstorming about both the actual story line and how to deliver it.
What happens if the story changes? It will! Count on it. One of the core consulting skills is flexibility and the ability to adapt. As the team tests the hypotheses with data, some hypotheses will be proven false; in fact, in the end, the entire story may be very different from the original version (we experienced that to a certain degree in our case study). This is normal and to be expected. The real risk in this whole process is if there is no flexibility and people become personally attached to their initial hypotheses, focusing simply on proving them without considering disconfirming evidence.
The story line migrates into a "storyboard" as data are collected and key insights are developed. One way to think of this is that the story line is the ongoing outline; it can perhaps be best portrayed on a portrait document with supporting pyramids (this concept will be elaborated upon in Chapter 9). The storyboard, then, is the translation of the story into a landscape slide deck, with insights at the top and data in the middle (also discussed more thoroughly in later chapters).
All three of these Rules of Engagement are illustrated in the deliverables for the case study at the end of the chapter. The exact format and obviously the ingredients of the maps will be different for your project, but these examples may serve as a helpful template as you move forward.
OPERATING TACTICS
The Operating Tactics for the Organize element of the TEAM FOCUS model are:
Tactic 28: Maintain objectivity as the hypotheses are tested during the project.
Tactic 29: Use frameworks as a starting point to identify issues for analysis.
Tactic 30: Explicitly list the types of analysis and related data that the team will and will not pursue (at least at that stage in the project life cycle).
Tactic 31: Revisit this list if the hypotheses are modified.
STORIES FROM THE FIELD
STORY FROM THE FIELD—1
Topic: Taking the time to make an educated hypothesis leads to project efficiency. Our first Story from the Field comes from an ex-McKinsey consultant who highlights how important it is to have the right hypotheses to ensure efficient analysis.
There is an important caveat to the McKinsey problem-solving approach. Consultants are pushed from day one to engage in hypothesis-driven problem solving. I was once asked to be involved in a team that was struggling. The engagement was nominally an effort to improve the organization of an oil and gas company. The team had already conducted two or three team problem-solving sessions, including pulling in organization experts, and generated lots of hypotheses to test. This effort was not leading the team to any ideas or insights into how the company's organization needed to change to make the company more effective. The team and the client were becoming a bit frustrated.
When I met with the team, the first question we discussed was, "How does this oil company make money?" The short answer was that no one knew. Two days of work later, we had a good handle on the company's sources of value creation. We used this material to focus another hypothesis-generation meeting on the company's possible organization issues. This meeting yielded a very robust set of hypotheses, and the study proceeded very smoothly from that point forward. My big "ah-ha" from this experience is that hypothesis-driven problem solving is a great approach, but you have to have gathered enough basic facts beforehand to inform the effort, or else you can end up wasting lots of time.
STORY FROM THE FIELD—2
Topic: Failure to implement well-framed, hypothesis-driven approaches leads to extraneous work and inefficiency. Fred Humiston, who is currently with Celgard, recalls two McKinsey projects where the issue tree development process certainly had room for improvement.
On two of my projects, we learned some valuable lessons about the importance of getting the issue tree right in the initial stages of the project. I had been trained in solving problems in a rigorous way (McKinsey's key to success), but sometimes we failed to rigorously adhere to our problem-solving method, or we abandoned it altogether. As a result, certain critical issues remained unresolved, and both our efficiency and effectiveness suffered. The first project where this was important was one involving a financial institution that asked the firm to help it get into a line of business that was considered standard for such organizations. In fact, the financial institution had been in this line of business before, but had sold it and was now looking at reestablishing it. The key questions were, "Should the financial institution enter this line of business again, and, if so, how?"
We established the issue tree (decision tree) with two high-level buckets, "do it yourself" vs. "partner with somebody," but the tree soon began to break down. We struggled with the next level of the issue tree—for "go it alone" we looked at the client's internal capabilities and constructed a financial model, whereas for the partnership situation we examined the pros and cons of each potential partner. This really wasn't an apples-to-apples comparison between the two options, but the team leadership decided not to come up with any clearer method to compare them. Two months later, we had evaluated the two options completely on their own, but without any meaningful comparison. Ultimately, we came up with a very simple answer, which was that any financial institution of this size should be able to profitably operate the line of business that we were investigating, and if they could do it on their own there was no point in partnering. However, only a limited portion of our work, and certainly not much problem solving, actually contributed to reaching this conclusion. To be honest, we could have saved the client money and time and had a better experience if we had focused more on framing before diving in.
The other project illustrated how important it is in working with a client to have that client understand the value of a clear problem-solving approach. This engagement involved a huge retail client that was deciding whether and where to expand overseas (not an uncommon project for McKinsey). Since it was only my second engagement, I was reluctant to question the problem-solving approach or the reaction to client input. We decided not to use a hypothesis-based method because the client didn't want us to have preconceived notions—it told us to look at the whole world and boil down the results. I sensed that something was amiss, and, looking back, I guess this should have been a red flag. However, this was a big client, so we wanted to please it and didn't push back. While we had a sense that we should be framing this in a strategic and financial issue tree, we spent six months doing a lot of work that was of almost no use to the client at all. We spent endless hours analyzing companies and country portfolios that only tangentially related to the key question, "Should the company expand internationally?" It could have been a golden moment for McKinsey's strategic acumen to shine through, an opportunity to define what a major player wanted to be in the world, but in the end the quality of our work was, in my view, below the traditional McKinsey standard. Oddly enough, the client was pleased with the work, but there was a palpable sense of let-down throughout the McKinsey team.
STORY FROM THE FIELD—3
Topic: Clear organization drives the success of one project, while failure to align team members toward a single goal contributes to mediocre results in another. Mario Pellizzari, who spent four years in the McKinsey office in Milan and is now with Egon Zehnder International, describes some pluses and minuses of organizing around hypotheses.