The McKinsey Engagement


by Paul N. Friga


  Factiva—a great starting point

  Market Research Monitor—short industry reports and market sizing

  Investext Plus—in-depth industry reports by investment banks and others

  Mergent Online—deep data on firms (can download financials)

  Reuters Business Insight—specific industry reports

  MarketLine—quick hits on companies and industries

  S&P NetAdvantage—well-regarded industry data

  Frost & Sullivan—international industry data

  Hoover's—company data on the big firms

  Periodicals and newspapers—current news (e.g., BusinessWeek and the Wall Street Journal)

  Nielsen—market metrics

  Google—the catch-all

  Just as with any tool, the only way to become proficient is to practice. I recommend that you force yourself to try all of these tools (and others) at least once to get a sense of what they can do for you. Ultimately, you will find your favorites and learn how to navigate them extremely well. Google continues to advance its offerings in this space every day and even offers sites for searching for data, sharing documents, and working live on team projects.

  My last suggestion is brief but critical: always document the source of your data on your charts. This is important for credibility (the idea is sufficiently supported), authenticity (it was not made up), and traceability (we can go back to the original source at a future point).
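  For teams that build exhibits programmatically, this habit is easy to bake in. Below is a minimal sketch, assuming Python with the matplotlib library; the figures and the source label are invented purely for illustration.

      import matplotlib.pyplot as plt

      # Hypothetical market figures, for illustration only.
      years = [2020, 2021, 2022, 2023]
      revenue = [4.1, 4.6, 5.2, 5.9]  # in $B

      fig, ax = plt.subplots(figsize=(6, 4))
      ax.bar(years, revenue)
      ax.set_title("Segment revenue, 2020-2023 ($B)")
      ax.set_ylabel("Revenue ($B)")
      ax.set_xticks(years)

      # The key habit: the source line lives on the chart itself, so
      # credibility and traceability travel with the exhibit.
      fig.text(0.02, 0.01,
               "Source: Mergent Online; company 10-K filings",
               fontsize=7, ha="left", va="bottom")

      fig.savefig("segment_revenue.png", dpi=200, bbox_inches="tight")

  Whatever tool you use, the principle is the same: the citation is part of the exhibit itself, not a note kept somewhere else.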

  OPERATING TACTICS

  The Operating Tactics for the Collect element of the TEAM FOCUS model are:

  Tactic 32: Design ghost charts to exhibit the data relevant to the overall story (see the sketch following this list).

  Tactic 33: Use primary research, especially interviews with client personnel; prepare interview guides ahead of time, and share the insights with the team in written form within 24 hours.

  Tactic 34: Always cite the source of the data on each chart created.
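  Tactic 32 can even be partly automated once the storyline is written out. Below is a minimal sketch, assuming Python with the python-pptx library; the slide titles are hypothetical, and the point is simply to lay down one placeholder slide per planned ghost chart before any data are collected.

      from pptx import Presentation
      from pptx.util import Inches, Pt

      # Hypothetical storyline: one declarative lead per planned ghost chart.
      ghost_titles = [
          "Market size has grown steadily since 20XX",
          "Competitor A leads the segment; the client ranks well behind",
          "Three levers drive most of the sales-improvement opportunity",
      ]

      prs = Presentation()
      title_only = prs.slide_layouts[5]  # "Title Only" layout in the default template

      for title in ghost_titles:
          slide = prs.slides.add_slide(title_only)
          slide.shapes.title.text = title
          # Mark the chart area with an explicit placeholder, to be replaced
          # once the supporting data have been collected.
          box = slide.shapes.add_textbox(Inches(1), Inches(2), Inches(8), Inches(4.5))
          box.text_frame.text = "[Ghost chart: data to be collected]"
          box.text_frame.paragraphs[0].font.size = Pt(14)

      prs.save("ghost_charts.pptx")

  The resulting deck is nothing but titles and empty boxes, which is exactly the point: the story comes first, and data collection is then scoped to fill it in.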

  STORIES FROM THE FIELD

  STORY FROM THE FIELD—1

  Topic: Strategic data collection that includes heavy use of ghost charts makes all the difference. Our first data-gathering story from the field is a classic tale of rags to riches that illustrates the importance of strategic data collection and what happens to a team without it. Brigham Frandsen recalls a project that was simultaneously his worst and his best project at McKinsey.

  This particular engagement was almost a textbook example of the pitfalls that result from not consciously following the Rules of Engagement in a study. However, it also demonstrates the rewards that can result from stopping, backing up, and deciding as a team to turn things around. What looked like a disaster of a study turned into one of the highest-impact and most personally rewarding studies during my time at McKinsey.

  McKinsey had done a number of highly successful studies with several country subsidiaries of a large central European retail-focused bank, studies that had resulted in exponential sales growth. As a result, one affiliate was eager to have us do the same thing for it, and as soon as possible. Thus a team was hurriedly assembled, but it included no engagement manager (there was only a one-year associate who was hoping to make EM), no partner, and nobody who had been involved in the other sales studies. However, eager as we were, the two other associates on the study and I dived in. Going in, we knew little except that we needed an implementation plan for revolutionizing sales, plus initial impact estimates, for a steering committee meeting in two weeks.

  We didn't "focus" in terms of data collection—we gathered everything and anything that we could get. As we hadn't framed or organized well, we had no direction for what data to collect. All anyone vaguely knew was that bank affiliates in other countries had had amazing successes in terms of sales increases as a result of these other McKinsey studies, and we wanted to accomplish the same thing. Unfortunately, instead of remedying the problem by leveraging the right expertise, we ran around like headless chickens, interviewing the heck out of the client, asking for its entire data warehouse in Excel format, and putting numbers on charts for the steering committee.

  All this came to a head the day before the steering committee meeting, when the absentee partner decided that he wanted to see what he would be helping to present the next day. Not surprisingly, he was not very encouraged by our results thus far. Because of lack of time, he wasn't able to sit down with us to jointly come up with a plan to get the material ready; instead, he got into a yelling match with the aspiring EM, and then gave us associates our orders for the pack, which involved an all-nighter preparing analyses and charts for the next day. That was my worst day and night with McKinsey. The next day, we younger associates were not even invited to the steering committee meeting, which by all accounts didn't go so well anyway—we hadn't gotten any sort of client buy-in ahead of time.

  Now, the turnaround. That very afternoon (after the meeting), we sat down as a team (this time with the partner) and slammed the brakes on that runaway train. The very first thing we did was apply the TEAM principles, which we should have done from the beginning. We each took a turn expressing our impressions of what had gone wrong thus far, what our expectations were (as a team and as individuals) for the project, and how to proceed. Among other things, when it was my turn, I said that I absolutely needed to see the end from the beginning (no more blind 10-hour analyses that end up being pointless), and no more all-nighters. I play basketball on Thursdays at 8, and I'm home to put my son to bed at 8:30; if I have to, I'll go back to work when he's asleep.

  One result of that team meeting was a feeling on everybody's part that we could make this a rewarding study. More practically, we made a plan to bring in two associate partners who had done similar studies. Within a couple of days, we had a complete framework combining lean banking with sales improvement, and we had the chapters "ghosted" out for the packs for the next two steering committee meetings and the final product, based on that framework. I then took the chapters that I was assigned and literally wrote out all the slides that I would need, complete with titles and ghost charts. In the process of doing this, we were able to clarify the key goals and develop a plan for gathering data, deploying the client team, and getting results. Having this done was extremely empowering for the rest of the study, and we ended up being very successful.

  STORY FROM THE FIELD—2

  Topic: Working with a large client team helps with data collection as well as buy-in. Our second story is from an ex-McKinsey consultant who learned about the challenges of working with client team members in data collection and analysis.

  For this project team, we had one of the largest groups of client personnel on a project that I ever experienced at McKinsey. The client freed up a significant number of people to work on the project full time, and this large client team ultimately contributed a great deal to the success of the project. Specifically, there were five primary McKinsey people, and each of them led a four- or five-member team of client employees. Because each McKinsey person had to lead a subteam of non-McKinsey people, we all had to do a lot of background work in order to be prepared.

  One challenge was that the teams of client employees tended to have low computer, analytical, and data-gathering skills; however, we knew that the only way to succeed was to engage the client, to understand the client, and to build a solution that would work for the client. We were able to accomplish this because there was a great amount of client commitment to achieve those goals.

  This commitment and focus on the client team carried over into data collection, deck creation, and project reviews. Normally, in these reviews, the McKinsey team initially maps out the project and gives periodic project updates to senior-level people. This usually takes place in a boardroom, with corporate-level executives and the McKinsey team discussing a slide deck in a vacuum. For this project, though, we decided that it would be necessary to have a more engaging project review. We wanted all the people on the client team to review their findings and to see the progress they were making; it was very helpful for them to see that their senior-level people were responding positively to the work they had done. So, the McKinsey team made a deck, but it also made posters (three or four for each subgroup). The members of the client team then would walk the client leadership through their progress, describe the key findings, and show the data that supported the conclusions.

  STORY FROM THE FIELD—BUSINESS SCHOOL EXAMPLE

  Topic: Periodically touching base with a partner helps to keep an intern on track with collecting relevant data. Our business school example comes to us from Ben Kennedy, who interned at a top consulting firm during the summer of his MBA program at the Fuqua School of Business at Duke University. We heard from Ben earlier in Chapter 6, and he dives into the data-collection issues a bit more here.

  Our project involved developing a plan for a client that served multiple industries and needed short-term growth. More specifically, our hypothesis was centered on testing a targeted list of industries to determine the highest-priority areas of focus. I was charged with one of three work streams on the project and worked directly with a senior-level partner.

  I spent a lot of time gathering and analyzing data. I was able to find many insights in the data on my own, but I really appreciated the discussions with my partner related to the data-collection and analysis process. I would gather the data and review it with the partner, and those discussions would help me identify new insights or find holes in my analysis (to be filled by more data). It would obviously have been inefficient to have the partner look at everything I gathered, and identifying what was important was a key part of my job; however, the regular (although brief) update meetings helped keep the research on track.

  As another note, during my internship, I also learned how important data are to support a point or, in some cases, refute what you hear in an interview. From time to time, you will get "opinions" during a client interview that ultimately are a bit subjective and not supported by evidence. This may be driven by political issues within the company, and it is important to stay as objective as possible.

  CASE STUDY

  Tim here. Our case study team learned over time that data are the currency of consulting projects.

  WHAT WE DID

  In our data-collection process, sharing information was invaluable. We introduced a very team-based approach early on in our project with the fact packs we created. We each collected relevant information about our bucket areas in order to help educate the other team members on the basic, important points of each category; that way, everybody had a better understanding of the whole situation. We continued sharing information throughout our project, e-mailing one another relevant articles and other sources that we ran across in our own research. I found that information sharing was even more important in this project than in others I've worked on, because we had, even collectively, virtually no experience with the U.S. city incorporation process. Everything was so new to all of us that it was very important for us to help one another get up to speed.

  Another critical component of our information-gathering process was finding a key contact. I managed to find a key contact who had a great depth of experience in precisely the area I was researching (the legal aspects of incorporation and annexation), and I leaned on him heavily for education and references to others with helpful information. He was even able to answer a surprising number of questions for other team members, and his comments helped to keep the entire team moving in the right direction. Sometimes a project requires the collection and synthesis of large amounts of data from diverse sources; other times, a single source serves as the sword that cuts through the Gordian knot. This was an example of the latter.

  Alan was also resourceful enough—and lucky enough—to find a great primary resource. He had an experience similar to mine, where his contact either knew the answer to his questions or knew whom to ask. He commented, "We should have gone straight to the source in more situations. One of the biggest takeaways was that we dove into all the FBI and CIA analysis, when we should have called the fire and police chiefs right away and just asked their opinions."

  Another helpful practice that we implemented was storing the results of our own research in slide format. As we collected data, we all put it right into slides (instead of Word documents or other formats). This made it very easy for us to share our research and to organize it in such a way that it could be easily found later.

  Additionally, we "outsourced" a lot of our research. As I described previously in Chapter 3, we used the Consulting Academy to create subteams that helped us with our research. We all found this to be extremely helpful, but Alan in particular had a very productive week: "Before we went into the academy week, I was spinning my wheels somewhat with all my individual work. Academy week was a great way to get a lot of research done quickly, and we even came out with a lot of slides made."

  WHAT I LEARNED

  Go to the source! Finding a key contact with deep knowledge about your project's topic is invaluable—especially in a situation like this, where you don't have much prior experience or existing knowledge about your research area. I certainly spent a lot of time becoming familiar with issues and exploring dead ends before I found my key contact. In this project, I realized that people are generally happy to help you, and that it's much easier to find an area expert and ask him or her questions than to try to wade through an abundance of new information yourself.

  I will definitely try to be more cognizant of the resources I have at my disposal in the future, and I will push myself to be more creative with them. Using the Consulting Academy students to supplement (and really drive) our research was a huge help, and with our time constraint and the size of our project, it was invaluable.

  DELIVERABLES

  Figure 7-2 Collect: Ghost Charts

  Figure 7-3 Collect: Interview Guide

  Figure 7-4 Collect: Interview Summary

  Figure 7-5 Collect: Key Secondary Sources

  8

  UNDERSTAND

  Figure 8-1 TEAM FOCUS Model—Understand

  CONCEPT

  This chapter addresses the real value that a consultant adds to a project—identifying the takeaways from data collection. The magic is then converting them into meaningful insights that form the essence of the answer to the key question. If the team has been following the scientific method as described in previous chapters, this is the stage where the original hypotheses are either supported or refuted. In most cases, the hypotheses are modified as the insights flow in, and these refined hypotheses become the ultimate recommendations that are presented to the client, which are discussed in detail in the next chapter.

  The biggest challenge that teams face in the Understand phase is developing the highest-quality insights. Teams describe this part as "hurting a little" from a mental perspective, as it requires additional processing power. The Rules of Engagement and Operating Tactics described in this chapter all provide guidance that is designed to make the Understand element as effective as possible. It is critical to build the right answer for the client based upon the supporting data and intuitive observations.

  RULES OF ENGAGEMENT

  You have gathered a lot of data and information, and much of it appears to have relevance to the key question. Now, the team must digest the data and articulate the insights that will eventually become the basis for the final recommendations. The Rules of Engagement for doing this as efficiently as possible are given here.

  RULE 1: IDENTIFY THE "SO WHAT(S)"

  Two of the most important words within McKinsey are "so what?" This term has immediate connotations of testing the relevance of the data that have been gathered for a particular study. To operationalize this concept, it may be helpful to ask and answer one of three questions: "What is the impact of this insight on the project team's tentative answer?" "Will this insight change the direction of our analysis?" and/or "Will implementation of this insight ultimately have a material impact on the client's operations?" The answer to the questions will ultimately be the answer—or the "so what."

  At McKinsey, team members would typically ask each other "so what?" many times every day as they sorted through the myriad of data available for analysis. When the team is struggling to find an answer, it is quite likely that the data and the insights currently associated with them are not very important, making them candidates for the appendix or the pile of interesting but irrelevant data and insights.

  RULE 2: THINK THROUGH THE IMPLICATIONS FOR ALL CONSTITUENTS

  This Rule of Engagement is closely related to the first rule, but it goes even further in specificity. The questions in the first rule have a yes/no answer regarding the relevance of the insight for the project—especially in terms of the client. The implications analysis in this rule goes deeper into the potential impact and broadens the scope of the investigation beyond just the immediate client.

  So who are the different constituents who should be considered in a consulting project? Essentially, there are three groups:

 
