Inside Apple: How America's Most Admired--and Secretive--Company Really Works

by Adam Lashinsky


  Jobs thought about far more than who would be the next CEO, however. In the same way that he obsessed over Apple products, he spent years preparing ways to make sure his vision would continue. Starting in 2008, as his health waned and he prepared for a liver transplant, Jobs created a management-training program, but one that was as different from programs that Hewlett-Packard and General Electric had offered as the iPad was from other tablets. Jobs already had some experience with in-house management training. Pixar University offers courses in drawing, painting, sculpting, and filmmaking, as well as leadership. Jobs was thinking beyond vocational skills. He wanted to record, codify, and teach Apple’s business history so that its future leaders would have a reference to ensure they thought different. With little fanfare, he created Apple University.

  Creating a management-training program seemed at odds with Jobs’s “stay hungry, stay foolish” persona—a counterculture persona he had cultivated since reading The Whole Earth Catalog. He had long denigrated the value of an MBA. He abhorred the concepts that gave business school professors their jollies, market research chief among them. He generally didn’t like MBAs, either. They had their place, but the people who mattered at an organization like Apple harbored passions for science or art or music, not business. (Forgive Tim Cook, the night school striver who rounded out his credentials while at IBM, his MBA. He’s as exceptional in his way as Jobs, the Reed College dropout, was in his.) Pooh-poohing MBAs, though, becomes a problem once a company finds itself one of the largest in the world. It needs structure at that point. It needs leadership. It needs people who think about the business world.

  In 2008, Jobs hired Joel Podolny, then the dean of the Yale School of Management, to create Apple University. Podolny, an economic sociologist whose area of expertise is leadership and organizations, was not a typical tweedy academic. He had taught at Stanford and Harvard, but he exhibited very Jobs-like behavior when he became the head of Yale’s graduate business school in 2005 at the ripe old age of thirty-nine. Podolny had been a controversial dean during his time in New Haven. He revamped the school’s curriculum away from single-topic courses like marketing in favor of topics with wider scopes, like “the employee” and “creativity and innovation.” In keeping with Apple’s penchant for secrecy and no-profile, Podolny went into a kind of Witness Protection Program when he arrived in Cupertino—especially among his old friends on the Stanford faculty. “He has become, how shall I put it, super clammed up about Apple,” said Hayagreeva “Huggy” Rao, a Stanford business professor, who, like others among the Stanford faculty, said he didn’t see much of Podolny. Initially hired to create Apple University, Podolny later was promoted to vice president of human resources, despite never having run an HR department.

  Jobs himself had long ignored the HR function at Apple, choosing to focus primarily on recruiting, which he considered critical. He was attuned to the fact, however, that Apple was missing out by shunning general management and avoiding leaders with traditional business backgrounds. “We don’t hire a lot of MBAs, but we believe in teaching and learning,” he once said. “We do want to create our own MBA, but in our image. We’ve got more interesting cases than anyone.”

  Podolny hired a handful of additional professors such as Harvard’s Richard Tedlow, and they began writing cases about Apple. Tedlow, who is sixty-four, is the preeminent US academic business historian and is best known for chronicling the lives and careers of the most successful American entrepreneurs of modern times, including George Eastman, Henry Ford, and Thomas Watson. He took a leave of absence from Harvard, where he was the MBA Class of 1949 Professor of Business Administration, to consult at Apple. Then, in 2011—and without so much as a press release—he retired from Harvard after twenty-three years to take up a full-time position at Apple. “He told me he’s doing what he did here but that he’s doing it for internal Apple executives,” said Richard Vietor, a Harvard colleague.

  Examples of the case studies being taught at Apple University include the story of how Apple crafted its retail strategy from scratch and Apple’s approach to commissioning factories in China. Wherever possible the cases shine a light on mishaps, the thinking being that a company has the most to learn from its mistakes. Apple executives teach the cases, with guidance from the professors.

  In his own book Giants of Enterprise, Tedlow makes trenchant observations about the challenges great companies face:

  There is no other field of human activity—including entertainment, sports, high fashion, or politics—which is so riddled by fads as business. Every day there is a newspaper headline, every week there is a magazine story, and perhaps with the Internet we will soon be saying every hour there is yet another “guru” that touts a new hero of business or a new method of solving problems which date back not merely ten years but far longer. At the least, the study of business history can prompt an executive to ask of each new “solution” to problems that can never be solved but only managed: How really lasting is this approach, this idea, this company?

  Drawing comparisons among the visionaries he researched, Tedlow observes that the men who created these great enterprises suffered from “the derangement of power.” He notes, “It is very common among the very powerful and very destructive. Norwegians have, in fact, a word for this syndrome. It is stormannsgalskap, which can be translated as ‘great men’s madness.’ ” If any company could fairly be described as being influenced by a leader with stormannsgalskap, it was Apple.

  If Tedlow has been addressing the subject of great men’s madness with his students, word hasn’t yet leaked out. Instead, he is teaching them business lessons about other companies that the Apple executives can apply to their own situations. For instance, Tedlow has lectured Apple’s PR staff on the Tylenol tampering crisis of 1982 and how the McNeil Consumer Products unit of Johnson & Johnson responded. He taught a class for executives about the fallen grocery store chain A&P as an example of what happened to a company that once dominated its field. Quipped an attendee: “We were all trying to figure out what A&P had to do with Apple.”

  Apple spent years keeping academics out, so it will be interesting to see over time the effect of welcoming them in. Tedlow’s last book before joining Apple was Denial: Why Business Leaders Fail to Look Facts in the Face—and What to Do About It. Marketing material for the book notes that a common sign of denial is the act of “focusing on a glitzy new headquarters rather than the competition.” Apple hardly ignores the competition. Then again, the last time he spoke in public, on June 7, 2011, Jobs unveiled plans for a magnificent new Apple headquarters, which he likened to a giant spaceship.

  The effect of Apple University on the company’s corporate culture could take years to become visible to outsiders. Some of the first perceptible differences between the Jobs and the post-Jobs eras will be seen sooner—in areas that were outside Jobs’s interest, or areas where the company’s shortcomings were directly attributable to him. Apple hardly was a perfect place under Jobs, so while his death represents a great loss, it also presents an opportunity. For example, a dirty little secret inside Apple is that Jobs was a one-man bottleneck. Steve Jobs was all too human, after all, and there was only so much he could do in the course of a day. Employees liked to say there were two kinds of projects at Apple: the ones Steve Jobs obsessed over and all the others. In fact, Apple tends to be a one-big-thing-at-a-time company, reflecting the legendary CEO’s willingness to concentrate only on one big thing at a time.

  When Jobs was CEO, a former Apple engineer described this phenomenon in predictably computer-scientist lingo: “He operates in a single-threaded manner. Other things will get put on hold.” When the first iPhone was under development, for example, the scheduled update of the operating system for the Macintosh was delayed by months because of the resources pulled to focus on the first mobile operating system.

  Jobs’s refusal to spread himself too thin at Apple—a problem alleviated only somewhat when he sold Pixar to Disney in 2006 and stopped spending a day a week at the animated film company across the San Francisco Bay—was consistent with how he wanted Apple to run. Generally speaking, Apple doesn’t multitask. The lower down employees are in the ranks, the more they focus on one project. The virtues of this approach are evident in Apple’s exceptional and limited product lineup. But having a singularity of focus has downsides, too. Apple is a sprawling, multiproduct company now. There’s reason to believe that less visionary managers will be willing to keep more balls in the air—at a time when Apple already is juggling more balls.

  Another little-discussed topic at Apple, given its success, is what could be called its orphan products, the features that Apple plainly doesn’t care all that much about. During Jobs’s tenure, employees knew the reason when one of their projects seemed to simmer on the back burner: Jobs wasn’t interested. An example is the inferiority of Apple’s spreadsheet program, Numbers, compared with its stellar presentation software, Keynote. “Keynote is a wonderful application because Steve did presentations,” a departed engineer pointed out. “Numbers doesn’t ooze Steveness, which makes sense, because Steve didn’t do spreadsheets.” Indeed, in the context of extolling the virtues of having one person, the CFO, keep a spreadsheet for the company’s finances, Jobs once boasted: “Nobody walks around with spreadsheets anymore.” It’s a ludicrous statement, of course. Tim Cook is a master of spreadsheets, and there’s no way a legion of Apple managers working on projects from real estate to logistics to manufacturing could function without them. But the sentiment nevertheless reflected Jobs’s attitude, and in fact Numbers isn’t a real rival to Microsoft’s Excel. If Apple wanted to make a serious effort to court business users for its computers, creating a better spreadsheet program would be a step in the right direction.

  Whole sectors of the company were ignored when something else had caught Jobs’s fancy, and they typically were slower-growing units. Macintosh computers faced this fate, for instance. Employees are completely aware of the phenomenon, and many who leave the company cite having found themselves in an un-hot corner of Apple with no opportunity to move.

  A new regime at Apple may institute subtler and more salubrious changes. Technology wonks like to gripe that Apple’s products look more beautiful than they are. In other words, Apple is accused of sacrificing mechanical design for industrial design. It’s a debatable point, as these same critics typically will say that Apple’s less-than-perfect products are still better than anyone else’s. Fair or not, Apple’s emphasis on aesthetics over functionality is directly attributable to the leadership of Steve Jobs. If there is room for improvement here, his absence may provide the opening.

  A post-Jobs Apple also may enter the modern era of financial management. Jobs for years was insistent that Apple maintain a strong balance sheet, so fearful was he of reliving the late-1990s experience of nearly going broke. He loathed stock buybacks, arguing, with good reason, that they are bribes to investors rather than good uses of capital. Keeping more than $75 billion lying around is nobody’s idea of good financial management, however. And Wall Street types have all sorts of suggestions for how Apple could do better here, such as paying dividends or investing the cash more aggressively. Such topics were considered off the table with Jobs. He treated cash as if he had lived through the Great Depression. What investors would view as modern balance-sheet management would have to wait for a CEO with an MBA. Tim Cook has an MBA, and he speaks regularly to investors, which is a start.

  There’s also the hint of evidence Apple can become a kinder, gentler place in the post-Jobs era. One of Tim Cook’s first official acts was to offer a corporate philanthropic matching program for employees. Jobs was notoriously stingy when it came to giving away money. He argued privately that the most philanthropic action Apple could take was to increase the value of the company so shareholders could give away their wealth to the causes of their choice, not Apple’s. Given his politically liberal leanings, Jobs reasoned that investors would prefer things that way. (Laurene Powell Jobs was even further to the left of her husband. Jobs joked to his biographer, Walter Isaacson, that he needed to “hide the knives” before inviting the right-wing News Corp. chairman Rupert Murdoch over for dinner.) Nevertheless, two weeks after becoming CEO, Cook told Apple’s US employees that the company would match gifts to charities up to $10,000 annually. “Thank you all for working so hard to make a difference, both here at Apple and in the lives of others,” Cook wrote in a companywide email. “I am incredibly proud to be part of this team.”

  Philanthropy and a spreadsheet program to compete with Microsoft are just some of the tea leaves that optimists about Apple’s future bring up when they talk about the company after Jobs. Certain quirks will be ironed out for the better without him, they say.

  There is a pessimistic view, too, that Apple will become less dynamic, its products less coveted, without Jobs. The glass-is-half-empty crowd envisions a scenario in which the pipeline of devices that we don’t even know we want yet runs dry in a few years. “Apple designed for Steve,” a former Apple software engineer said. “It is not an exaggeration. Steve was the user that everything orbited around and was designed for.”

  The entrepreneur Mike McCue, who never worked at Apple but is one of those start-up junkies who long idolized Jobs, tells a story that illustrates the Steve-as-linchpin perspective. “I once spoke with Jony Ive about how wonderfully connected Apple’s whole product line was,” McCue said.

  I was standing in an Apple store, back when they came out with their first set of new Macs and OS X [Apple’s desktop software]. And I remember looking up at the screen, and their website had these sort of gray translucent lines thematically in the website. And if you ran your eye up the screen, up to the menu bar of OS X it had these gray translucent lines. And then you ran your eye up even further in the cinema display and they had these gray translucent lines. And then I looked over to my left and there was a barrier, a glass barrier that separated [the different areas of the store] and it had these gray translucent lines. And I asked Jony, “How did that happen? Who does that at Apple?” And he was like, “Steve does that.”

  Jobs also dominated Apple in an intangible way. He was the final arbiter on matters of taste. A former Apple engineer who left for a Silicon Valley start-up described the differences in how math-oriented Google and design-oriented Apple work. When Jobs was CEO, he made decisions on matters as routine as the color palette for a website. “Let’s say Google is trying to determine the correct color for a new page,” said the engineer. “It will order an analytical test by serving up various shades of blue to one million Google.com users and then analyze the click-through rates.” Google, in other words, takes a democratic approach: Users can’t be wrong, and they vote with their clicks. What’s more, were an engineer even to have an opinion about the correct shade of blue, he’d be outvoted by the user analysis. At Google, crowdsourcing rules.

  User democracy is the antithesis of how Apple operates. Jobs famously told customers what they wanted. He didn’t ask their opinion. “The Apple way is that Steve picked the color he liked and that’s the color,” the former Apple engineer concluded. “He was willing to listen to counterarguments. But if you [were] arguing taste or opinion, it was a losing battle.” This view of Apple as a kind of consumer-electronics fashion house leaves little hope for a new creative and entrepreneurial genius to rise from the ranks. After all, with Jobs calling the shots on matters of style across the company, his subordinates won’t have been able to try their hands at the game.

  Finally, there is a third view—the grand hope of Apple’s supporters, the optimistic viewpoint—that Steve Jobs so thoroughly stamped the company with his DNA, the fledglings are ready to fly on their own. Therapist/business coach Michael Maccoby, the expert on visionary and narcissistic leaders, identified indoctrination as one of the productive narcissist’s primary goals.

  The narcissistic CEO wants all his subordinates to think the way he does about the business. Productive narcissists—people who often have a dash of the obsessive personality—are good at converting people to their point of view… [Jack] Welch’s strategy has been extremely effective. GE managers must either internalize his vision, or they must leave. Clearly, this is incentive learning with a vengeance. I would even go so far as to call Welch’s teaching brainwashing. But Welch does have the rare insight and know-how to achieve what all narcissistic business leaders are trying to do—namely, get the organization to identify with them, to think the way they do, and to become the living embodiment of their companies. [Emphasis added.]

  As I’ve noted, legend has it that in the years following Walt Disney’s death in 1966, top Disney executives were known to ask, “What would Walt do?” But the Walt Disney Company is a cautionary tale for students of Apple because Disney declined precipitously when Walt was gone. In the years after Disney died, his lieutenants pumped out a final volley of classic old-school Disney animated musicals—Walt’s pipeline of products. The Jungle Book in 1967 was one. But then the output got spotty and weird (The Black Cauldron, The Great Mouse Detective). It really wasn’t until 1988, with Who Framed Roger Rabbit?, and The Little Mermaid the following year, that Disney’s animation efforts got back on track. These films triggered the Disney renaissance, but there were a lot of elements in both of them Walt might not have approved of. The voluptuous Jessica Rabbit comes to mind as well as the sea witch, Ursula, whose body was based on the drag queen and John Waters staple Divine.

  Even with these successes, under the leadership of an executive hired from Paramount, Michael Eisner, Disney fell so far behind on innovation that it had to buy Pixar. That company, funded by Steve Jobs, did see the future of computer-aided animation, forcing Disney to play catch-up on the latest technology in a field Disney invented.
