Creative Construction


by Gary P. Pisano


  The Assumptions Behind the Eat Your Own Lunch Logic

  If you are going to follow (or discard) advice, you should understand the logic behind it. The logic of “eat your own lunch” rests on two fundamental assumptions. The first, which is often not true, is that you can perfectly predict the future. You think you know for sure that a particular transformative innovation will make your current technology or business model obsolete. But, as we saw with IBM, technology trends in reality are very hard to predict. The second assumption behind the eat your own lunch argument is that the potential disruption in question offers profit opportunities that are at least as good as your alternatives. That is, it presumes you are better off abandoning your current market position to embrace the new regime because if you do not, you will go bankrupt. But, as IBM discovered, PCs were never as profitable as mainframes. Deciding how to respond to a potential disruption threat requires that you examine whether these assumptions hold. Let’s explore them in more detail.

  Assumption 1: You Can Predict the Timing of Disruptive Threats

  Those who preach the eat your own lunch gospel have plenty of examples to illustrate the dire consequences of failing to follow the advice. They can point to all those traditional bricks-and-mortar retailers who failed to embrace the Internet and were wiped away by the likes of Amazon. They can describe the downfall of traditional media companies who were slow to realize how Google and other online platforms would ultimately disrupt the advertising business model of the industry. They can evoke the ghosts of once-dominant companies like Nokia, RIM, Blockbuster, DEC, Wang, Kodak, Polaroid, and many others who were swept away by the Schumpeterian gales of creative destruction. In each of these cases, so it seems, leaders could not bring themselves or their organizations to swallow the bitter reality facing their industry. They could not jettison their past in order to have a better future. They could not, in short, bring themselves to eat their own lunch. What else could explain such suicidal behavior?

  These cases can be persuasive and frightening. When I read them, I am reminded of watching a horror movie—the type where creepy villains hide in the basement. As you see the story unfold, you want to shout the business equivalent of “don’t go in the basement” or “get out of the house now”! But, just as they do in the movie, the leaders in these cases ignore your pleas, and bad things happen. It all seems so predictable.

  And yet, at least in the case of innovation, it is not. It only tends to look that way after the fact. In reality, technological and business model evolution is a messy and uncertain process—full of dead-ends, wrong turns, and random effects. What might look obvious in retrospect was by no means obvious at the time leaders of these enterprises peered into the future. Predicting technological trends is brutally difficult. Even the most visionary business leaders can get it wrong.

  The Future of Electric Cars

  A visionary technologist sits across from the journalist and calmly proclaims that the electric vehicle will be the car of the future. There is good reason to believe the speaker—he has been a successful entrepreneur, with deep technical knowledge of the trends transforming the contemporary economy. Surely, I must be talking about Elon Musk, the entrepreneur behind electric car producer Tesla, SolarCity, and PayPal. But I am not. The technology visionary in this case is Thomas Edison, who in 1914 told an interviewer, “I believe that ultimately the electric motor will be universally used for trucking in all large cities, and that the electric automobile will be the family carriage of the future.”3 And Edison was not the only visionary who believed in the future of electric cars. Henry Ford was also enthusiastic. Ford collaborated with Edison to develop an electric car, buying more than 100,000 of Edison’s batteries and committing more than $30 million (in today’s dollars) to the car’s development—more than a fifth of the $135 million investment that a much larger Ford Motor Company made in electric vehicles in 2010.4 As late as 1917, the Wall Street Journal published an article about a new electric automobile company with the title, “Another Wonder of the Age on the Threshold: Dey Electric Automobile, Noted Engineers Say, Promises to Revolutionize the Auto World,” in which a world authority on alternating current argued the electric car would lead the automotive field with low prices.5 Noting increases in electric cabs, another 1917 article from the same newspaper pondered whether, “should the electric vehicle assume sufficiently great proportions, it is possible that many cities may compel drivers of cars to use electric power within city limits rather than the smoky and noisy gas motor.”6

  Of course, as we now know, by the early 1920s the gasoline internal combustion engine had become the entrenched dominant design for automobiles. This happened because of a confluence of forces, not all of which could have been predicted at the time. Technical improvements in internal combustion technology made the engines quieter, cleaner, and smoother. The invention of the electric starter by Charles Kettering in 1912 eliminated the inconvenience (and danger!) of having to start the engine with a hand crank.7 Ford’s Model T, introduced in 1908 for $850, was meanwhile falling rapidly in price.8 The internal combustion engine was also boosted by the dramatically expanding availability of gasoline, thanks to a combination of greater domestic crude oil production and refining capacity.9 The expanding supply of crude drastically reduced the cost per mile of gasoline vehicles. One advantage gasoline cars had over electric vehicles at the time (and even today) was greater range. This range advantage, however, was of little use to consumers until there were enough roads to take them longer distances. Thanks to federal support of road building and maintenance (in no small part influenced by the growing popularity of the Model T), the US road system (particularly in rural areas) expanded dramatically between 1916 and 1921.10 The expansion of the road system made the gasoline-powered car an alternative to the railroads’ monopoly on long-distance and interstate travel. While charging stations for electric cars existed in the 1910s, much of the country did not have electricity, let alone standardized electricity (AC vs. DC).

  Why You Should Take Innovation Predictions with a Grain of Salt

  The world is awash in predictions about future technologies and business model innovation. You hear them at industry conferences and in TED Talks. You read them in blogs, books, newspaper articles, tweets, and consulting reports. In fact, there is a whole cottage industry of people dedicated to making these kinds of predictions. They are called “futurists.” Futurists are usually the ones cautioning you about the next big disruptive wave that will kill your business. Sometimes they are right—sometimes they are wrong. And that’s the problem. It turns out to be really hard to predict technology and business model trends.

  There is a tendency to emphasize how quickly things are changing and how different the future will be from the present (a predictable bias of people who, after all, call themselves futurists). But as with mainframe computers, supposedly obsolete technologies can be surprisingly tenacious. There are other examples. The transition from sail power to steam power for oceangoing vessels took more than sixty years.11 The first electronic fuel injection for the car was introduced in the 1950s, but it was not until the 1980s that it overtook carburetors, thanks largely to tighter government standards on emissions.12 The first commercial airplane to use a jet engine (instead of a propeller) was introduced in 1952, yet propeller aircraft engines are still being designed, manufactured, and used for smaller, shorter-haul aircraft today.13 When Steve Jobs introduced the iPad, he predicted tablets would replace notebooks. In 2013, tablet sales surpassed notebooks, partially validating his vision.14 But, with unit sales of more than 160 million in 2017, notebooks have by no means gone away.15 Internet-based banking has surged in the past twenty years, leading many to expect the end of the branch. Yet the number of bank branches in the United States continued to grow steadily until 2009, when it peaked at 35.72 per 100,000 adults. The number in 2014 (32.40 per 100,000 adults) was essentially the same as it was in 2004 (32.5 per 100,000 adults).16 We all know about the decline of print newspapers as digital channels have ascended. While it is true that print circulation has been declining for years, print still made up 78 percent of daily circulation and 86 percent of Sunday circulation in 2016.17

  But it is also just as easy to underestimate how quickly things do change. Consider what happened in the smartphone market. In 2008, a year after Apple introduced the iPhone, Nokia’s proprietary operating system for smartphones, Symbian, still commanded a 48.8 percent share of the market (a slight increase from the prior year).18 So if you were running Nokia at the time, you must have been feeling pretty good—Apple’s much-vaunted iPhone had no negative effect on your share. But then came Google’s Android operating system, introduced in 2008. Within just four years of its introduction, Android’s share had soared to 74.4 percent, Apple’s iOS held most of the remainder with 18.2 percent, and Nokia’s Symbian had all but vanished, with a 0.6 percent market share.19

  There are other examples of rapid disruption. Video on demand displaced the market for rental DVDs within about three years of its introduction. Within eight years of its founding, Amazon was selling more books online than Barnes and Noble was selling through its stores. In 2004, about 90 percent of US households had a wired landline for telephone service. By 2014, only slightly more than half did, and the share of households relying solely on cell phone service had jumped to about 45 percent.20

  Prediction is always inherently difficult, but predictions about technology and business model trends are particularly vexing for three reasons. The first is what I call the “systems problem.” Few technologies exist in isolation. They are part of more complex systems. This means that the economic viability of any given technology depends on the interaction of multiple complementary technologies and economic forces, each subject to its own uncertainty. The performance of complex systems tends to progress in a nonlinear fashion. Inflection points are extremely hard to anticipate because their timing is influenced by the interaction of multiple factors. For instance, if you want to predict the market penetration of electric vehicles in the next ten years, you have to get your arms around trends in battery technology (in itself a highly complex space), advances in electric motors, materials that affect vehicle weight, potential advances in internal combustion technology, advances in fuel technology, the cost of gasoline, the cost of electricity, future government policies, and future customer preferences, to name just a few factors.
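
  To see why such interacting uncertainties make inflection points so hard to call, consider a small numerical thought experiment. The sketch below is not from the book, and every rate and range in it is an invented assumption; it simply draws random annual improvement rates for a few interacting factors (battery progress, charging build-out, policy support, and continued improvement of the incumbent gasoline engine) and records the year in which a hypothetical electric vehicle reaches cost parity with a gasoline car.

```python
import random
import statistics

# Toy Monte Carlo sketch -- illustrative only; every rate below is a made-up
# assumption, not data from the book. The question it asks: in which year does
# a hypothetical electric vehicle reach cost parity with a gasoline car, given
# several interacting, individually uncertain factors?

def parity_year(start_year=2020, horizon=30):
    ev_cost = 1.5    # EV cost index relative to a gasoline car (1.0 = parity)
    gas_cost = 1.0   # gasoline car cost index; the incumbent keeps improving too
    for year in range(start_year, start_year + horizon):
        battery_gain = random.uniform(0.02, 0.10)  # uncertain battery progress
        infra_gain = random.uniform(0.00, 0.04)    # uncertain charging build-out
        policy_push = random.choice([0.0, 0.02])   # a subsidy may or may not arrive
        ice_gain = random.uniform(0.00, 0.03)      # "last gasp" of the incumbent
        ev_cost *= 1 - (battery_gain + infra_gain + policy_push)
        gas_cost *= 1 - ice_gain
        if ev_cost <= gas_cost:
            return year
    return None  # parity not reached within the horizon

runs = [parity_year() for _ in range(10_000)]
reached = [y for y in runs if y is not None]
print("median parity year:", statistics.median(reached))
print("earliest:", min(reached), "latest:", max(reached))
print(f"runs that never reach parity: {1 - len(reached) / len(runs):.0%}")
```

  Run it a few times: even with these tame, made-up ranges, the predicted inflection year swings by several years from simulation to simulation, and widening the assumed uncertainty in any single factor stretches the spread further. That, in miniature, is the systems problem.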

  Such system interdependencies are also at work in business model innovation. Netflix initially became viable because video storage technology shifted from bulky VHS cassettes to cheap-to-mail DVDs. But television screens were also getting larger and sharper, thanks to advances in display technology and graphics circuitry, and this made watching movies at home a more pleasurable experience. This increased demand for at-home viewing fit perfectly with Netflix’s fixed-price (“all you can eat”) subscription model. This all looks so predictable now, but at the time it was not. In fact, Netflix’s original business model did not include a fixed subscription price—it charged per-film rental fees, much like Blockbuster. It was not until later—after disappointing market penetration—that Netflix shifted to the subscription model.

  The second impediment to technology and business model predictions is what I call “endogenous customer preferences.” The textbook description of entrepreneurial innovation is one where a visionary leader “sees” an unmet need and then develops a breakthrough innovation to serve that need. Reality is a bit messier. Customer needs and preferences are neither static nor given. We discover what we want by seeing things that we never imagined. This means that it is hard, ex ante, to determine market demand for needs that do not currently seem to exist. Intel did not invent the microprocessor because its leaders saw the potential to transform the world’s economy and turn itself into a colossus of the digital age. The story has more prosaic origins. In 1969, Busicom, a Japanese calculator company, asked Intel to design a set of twelve integrated circuits to handle the logic (mathematical functions) for a new line of calculators. Intel engineer Ted Hoff got the idea of combining the functionality of all twelve chips into a single chip—a device we now call a “microprocessor.”21 But it is not as if Intel immediately understood the potential applications of its new invention. In fact, originally, the rights to the microprocessor were assigned to Busicom. It was only later that Intel bought back the non-calculator rights to the technology, in what must be considered the greatest business development deal of all time. But, even then, Intel did not fully understand the potential markets for microprocessors.22

  Initially, the uses for microprocessors appeared limited. They were underpowered relative to the traditional multichip modules used in existing mainframes and minicomputers. The market for personal computers had not yet been created. When the first PCs did hit the market, Intel’s visionary leader, Gordon Moore (after whom Moore’s Law is named), did not see their potential. He dismissed home computers as devices that might find limited use as a way for “housewives to store recipes.” The point is not that Intel was incompetent. It clearly was not. The ultimate big market for Intel microprocessors was one that did not exist, and forecasting a market that does not exist, or is in its infancy, is nearly impossible. None of us knew we needed a personal computer until we had one. Customers do not always know their own preferences, and they reserve the right to change them any time they please without notice!

  Finally, whether and how fast a new technology triumphs depends not just on its own (uncertain) rate of progress, but also on the (uncertain) rate of progress of the technologies it seeks to replace. There is often a presumption that technologies that have existed for a long time are somehow “out of gas” and that, when confronted with “new” technologies, they will quickly collapse. But, as documented by the work of Dan Snow of Brigham Young University, old technologies often continue to progress and improve quite dramatically in the face of potentially disruptive innovations, a phenomenon he called “last gasps.” For instance, carburetors continued to improve after the mass introduction of electronic fuel injection in the early 1980s, a feat partly accomplished by integrating some elements of electronic fuel injection into carburetor designs.23 Snow’s work shows that such last gasps can be quite prolonged.

  The point is not to urge you to ignore technological or business model threats, nor is it to dismiss ominous forecasts. That would be myopic, and myopia is never useful. Companies do get destroyed by radical changes in technology and by disruptive business model innovations. Instead, I hope to have sensitized you to the delicate balance you have to strike in responding to potential threats. On the one hand, something that does not look like much of a threat today could quickly become one due to unexpected changes in complementary technologies, regulations, or markets. On the other hand, you don’t want to prematurely abandon existing technologies and business models that may actually have long, economically productive lives in front of them. Abandoning an existing strong position too early can be just as costly as abandoning it too late. We need to accept that our radar screens are much foggier than we realize. Uncertainty has important practical implications for how we conduct innovation strategy. Later in this chapter, we will examine these. Before we do that, however, we must examine another critical assumption of the “eat your own lunch” logic.

  Assumption 2: All That Glitters Is Gold

  Let’s start with a thought experiment. Like many such experiments, it stretches reality a bit by endowing you with unparalleled powers of prediction. I am going to assume that all the challenges of technology and business model prediction we examined above do not apply to you. You can predict the future perfectly. Now let’s assume you are CEO of Kodak in 1983, and, because you are such an omniscient visionary, you can see the future of digital imaging perfectly. You know how fast it will improve; you know the rate at which all the required complementary technologies will improve; you know the rates of investment of all your competitors. You have exact dates when images of different quality will be available, and you even know the prices. You are pretty much a wizard. And, obviously, you can see the dire implications for your core film business.

  If there was ever a time to eat your own lunch, this would appear to be it. Abandon film and go headfirst into the digital future. Right? Not so fast. There is another assumption made by the “eat your own lunch” logic that also needs to hold up before you leap. You have to assume that the new technology (say, digital imaging) is as profitable as your existing technology (say, film) or all the alternative things you could do with the shareholders’ money (like give it back to them). That may or may not be true, depending on the circumstances. There is no economic law that says new technologies offer the same profit potential as old technologies.

  Digital imaging is actually one case where technological change dramatically reduced the profit potential of the business. The usual story about the fall of Kodak is that it was too slow to respond to digital technology because it was so wedded to its traditional business—film. That is, in theory, Kodak failed because it did not eat its own lunch. But this story has been convincingly challenged by the recent work of my Harvard colleague Willy Shih.24 Willy should know the Kodak story well, because he joined Kodak as an executive in 1997, just as digital photography was ascending. As Willy describes it, Kodak was not slow at all in responding to digital. It was tracking trends carefully and investing heavily in critical digital technologies (like sensors).25 The problem was that digital technology transformed the structure of the photography market in a way that completely eroded its profit potential. Digital photography was based on general-purpose semiconductor technology available from multiple suppliers. As Shih notes, “Suppliers selling components offered the technology to anyone who would pay, and there were few entry barriers. What’s more, digital technology is modular. A good engineer could buy all the building blocks and put together a camera.”26 With such low entry barriers, it is not surprising that hundreds of companies (many small start-ups like GoPro) were able to enter the market. Moreover, broad availability of the core technologies made it difficult for companies offering digital cameras to differentiate their products from others. Low barriers to entry plus low opportunities for product differentiation equals very low opportunities for profit for everyone. Eating your own (very profitable) lunch is not necessarily a good strategy when the alternative is table scraps.

 
