Lazonick is a fan of some parts of HBS, including the cross-disciplinary conversations that take place at the School, a rarity in higher education. And he does think that there are some very smart people there doing very interesting things. “There will always be somebody at HBS—a Clayton Christensen, a Michael Porter, or a Josh Lerner—who is doing their part at helping us understand the really big picture,” he says. “But I have never seen a case where one of them establishes their niche and is subsequently able to tolerate critical debate. You saw where mine ended with Jensen, but it’s not just him; it’s everyone.”
In recent years, Jensen has been on a futile quest to alter a legacy that’s been carved in stone at this point. A kinder, gentler Jensen showed his face in 2011 with his paper “Putting Integrity into Finance.” Along with coauthor Werner Erhard, Jensen claimed to have revealed “a heretofore unrecognized critical factor of production.” That factor? Integrity. Plowing forward with his “new” ideas about integrity, Jensen claimed to offer “an actionable pathway to increasing integrity,” and concluded that “integrity thus becomes a necessary (but not sufficient) condition for value maximization and for a great life.”23 There may be no better evidence for the pitfalls of the tenure system than this.
Werner Erhard, it should be noted, is the man behind the “est” (for Erhard Seminars Training) movement, which lasted from 1971 through 1983, and the Forum, which stretched from 1984 through 1991. The gist of est was an attempt by participants to understand who they were at their core, which dovetailed nicely with Jensen’s insistence that we are all whores. Read closely and you’ll see that the language Jensen used to describe his early courses at HBS sounds more like something you’d hear at an est retreat than a university curriculum: “We do not just teach; we are devoted to making a difference in people’s lives. But a change in behavior almost always means giving up an associated way of thinking, and giving up a way of thinking is almost always a painful process. Providing students with an opportunity to learn material that fundamentally changes the way they see the world generates confusion, discomfort, and controversy.”24
And get a load of this gem of an argument: “As an example of the confrontation of emotions with reason, we use the case of Michael Milken and Drexel. Who, we ask our students, has created more value in this century than Milken? Was Milken overcompensated for that effort (even after legal penalties) by his contract with Drexel, which gave him a $50,000 salary and approximately one-third of the profits of the division he founded at Drexel? Overcompensated according to what standard? How was Drexel’s decision to break its contract with Milken different from (for example) a machine shop’s refusing to honor its contract with its workers when they ‘make too much money’? What will each of these actions do to break the level of trust in an organization which is so necessary to its functioning? Was it Drexel’s failure to take the reactions of other constituencies (such as the public’s envy of Milken and others like him) into account that caused the destruction of the firm and the resulting loss of 10,000 jobs? Where was the ‘morality’ in that sequence of management decisions, or in the attacks on Milken by competitors, government officers, and others whose wealth and power were threatened by the profound changes in capital markets that he induced?”
The problem with enumerating the many absurdities in that paragraph is like that of Michael Jensen’s imaginary CEO who freezes up when confronted with competing priorities: It’s hard to know just where to start. As George Orwell observed, “There are some ideas so absurd that only an intellectual could believe them.” And some ironies, too, including that of a tenured professor acting as an apostle of the free market.
What did Michael Jensen achieve, in the end? In short, he helped an entire generation of businesspeople lower its opinion of itself, and give in to its baser motives. For all his economic equations and insistence on the testability and refutability of the logic of his opinion—and it was simply that, an opinion—he released CEOs, institutional investors, and Wall Streeters from the obligation of considering anything but their own narrow wants and needs.
Not only that: Whatever “testing” of his ideas Jensen managed to get practitioners to engage in, what happened in the end was that management adopted those parts of agency theory that suited their needs and simply ignored the rest. In this particular instance, HBS’s claim to be “closer” to managers than other business schools can certainly be validated, although only as a courtier of useful self-serving ideas. American managers loaded their companies with debt, per Jensen, they started paying themselves in equity and options, per Jensen, and they did everything they could to juice the value of that equity. And in doing so, they sacrificed long-term value for short-term gain, going so far as to engage in outright fraud to bring it about.
Here’s one thing Jensen didn’t talk about much when he talked about the wondrous world of hostile takeovers: the insider trading that it unleashed. The crimes first came to light in 1986 when Dennis Levine of Drexel Burnham Lambert was arrested, accused of making $12.6 million from insider trading, and charged with obstructing justice and attempting to destroy records. From there, everything snowballed. Levine implicated Ivan F. Boesky, an arbitrageur, who then implicated Martin Siegel (’71), formerly of Kidder Peabody but then working for Drexel Burnham Lambert. And here’s a fun fact for your next cocktail party: It was insider trading in the shares of none other than Enron, which Michael Milken had helped finance, that tipped investigators off to Levine in the first place.25 A number of HBS graduates were ensnared in the ensuing investigation, including Siegel, Paul Bilzerian (’77), and Ira Sokolow (’81).
Fred Joseph (’63) was CEO of Drexel when it was brought to its knees during the scandal, although he denied any involvement in the insider trading schemes and claimed he was only guilty of “surprising naïveté.” The feds bought his alibi, and the Securities and Exchange Commission merely reprimanded him for failing to properly supervise his star employee. “He was the captain of the ship, and in navy terms, if you’re the captain and the ship has trouble, the captain takes responsibility—he always took responsibility,” Joseph’s classmate Charles D. Ellis, who worked with him at Drexel, told the Harvard Crimson, apparently confusing Joseph’s complete denial of any responsibility with its opposite.
The School has of late added a course on Leadership and Corporate Accountability, and the description of the topics for examination within it sounds like a direct repudiation of Jensen: “decisions that involve responsibilities to each of a company’s core constituencies—investors, customers, employees, suppliers, and the public,” with discussions on insider trading rules, the fall of Enron, human character, employee responsibilities, labor laws, corporate citizenship, socially responsible investing, and serving the public interest. But in this, its influence can clearly be seen as akin to pushing on a string.
“[While] the average firm has assiduously applied the agency theory principles that increase corporate entrepreneurialism and risk,” write Frank Dobbin and Jiwook Jung in their 2010 paper, “The Misapplication of Mr. Michael Jensen: How Agency Theory Brought Down the Economy and Might Do It Again,” “it has not applied the principles that bolster monitoring and foster executive self-restraint. Executives have not been required to hold more equity, and boards have not truly gained independence. These changes encouraged corporations to invest in riskier ventures and to deceive shareholders in the early years of the first decade of the twenty-first century, producing high-profile failures at Enron, WorldCom, and Tyco, and contributing to the recession of 2001. These changes then encouraged banks, insurance companies, and industries to assume speculative risk at the end of that decade, contributing to the Great Recession. There was nothing to prevent firms from pursuing excessive risk, and little to prevent them from committing outright fraud, and there is still little to restrain either pattern of behavior.”
It almost feels unfair to pick on Michael Jensen. He came up with a neat and tidy theory that appealed to the economic mind, and he ran with it. But he created a Frankenstein that no one knows how to kill. In a 2005 interview with the New York Times, Jensen admitted as much. In a discussion about the propensity of executives to be overoptimistic about forecasts that support lofty share prices, he said, “[If] executives would present the market with realistic numbers rather than overoptimistic expectations, the stock price would stay realistic. But I admit, we scholars don’t yet know the real answer to how to make this happen.”26 That’s called ethics, you will recall, and Jensen is right: They don’t know how to teach it as well as they know how to teach financial engineering, and they never will. When Jensen finally has the decency to leave HBS, they will be forced to come to terms with their embarrassment at having hired him in the first place.
But it wasn’t only HBS. By the time the 1980s were finished, the majority of MBA degrees lacked any pretense to higher ideals and had become nothing more than a personal ad suggesting the graduate was a mercenary for hire who was committed enough to fighting that they’d invested in the expensive weaponry known as the MBA. It didn’t even matter what war was being fought. Said the late Harold Leavitt of Stanford: “The new professional MBA-type manager [in the 1980s] began to look more and more like the professional mercenary soldier—ready and willing to fight any war and to do so coolly and systematically, but without ever asking the tough path-finding questions: Is this war worth fighting? Is it the right war? Is the cause just? Do I believe in it?”27
HBS could hardly claim that they did, either, because in arming their graduates to join the forces of investment banking, private equity, and consulting in their mission to roll back the last vestiges of the managerial revolution, they were acting like modern arms dealers who have no compunction in selling weapons to both sides of a single conflict—they were literally sending their new graduates out to fight against their old ones.
But there was one important change: Able to understand the seismic shifts in the structure of corporate America, MBA graduates no longer sought job security—they wouldn’t have found it if they had—but began focusing almost exclusively on opportunities for rapid promotion and high compensation. Lost in the transition: any notion of a sustained effort to build organizations that create useful products and services, provide employment, and contribute to their communities.
It wasn’t that they weren’t smart enough to understand the choice that they were making; it’s just that the money was too good. By the end of the 1980s, salary levels for MBAs were skyrocketing while real wages for the average American worker were in decline. In 1978, the New York Times described a Harvard MBA as a “golden passport” to financial well-being. By 1980, the starting salary for an elite MBA student was almost double the U.S. median pay level.
Managerialism Was Already Dead
Lawrence Fouraker was prone to philosophical flights of fancy in his annual reports to the president of Harvard when he was dean of HBS, and his 1973–74 report was a prime example of such. In it, he compared what he called the three primary economic systems of the United States: capitalism, managerialism, and socialism. Capitalism was the profit motive, its primary mechanism the market, and its sovereign the consumer. Managerialism was the response to capitalism’s success, which had resulted in two kinds of markets—external (that is, consumers) and internal (the “economy” within a firm itself). The manager was supposed to have some sort of exclusive technical ability required to mediate between the two. Socialism was the nonprofit part of the economy—government, education, and health care.
Fouraker was writing at the apogee of confidence in managerialism, and it showed. “Conflicts are resolved in the managerial system by problem solving behavior. . . . This is the major function and operating responsibility of a manager. The diversified corporation has many constituencies: shareowners, customers, competitors, employees, regulatory agencies, creditors, neighbors. Management engages in continuous problem solving to come to terms with each group in a coherent fashion, consistent with the long run well being of the constituents and the corporation.”1 After acknowledging the emerging squabble between the three systems, he then looked to the future. “Perhaps the easiest coalitions to form will be those that merge parts of the capitalistic and managerial; the most difficult probably will be efforts to bridge between socialism and capitalism.” He was right about that last point—to this day, the tension between capitalism and socialism continues to be a vexing one for the country. Consider the passage of Obamacare. But he was dead wrong about the former—capitalism and managerialism did not form a coalition; capitalism chewed managerialism up and spit out the bones.
In defense of Michael Jensen, then, it is something of an overstatement to suggest he participated in the murder of managerialism, because by the time he was spreading his ideas about corporate and managerial purpose, managerialism was already dead. Jensen just served as the stake driven through the heart of its corpse. It is thus more accurate to say that Jensen helped ensure that managerialism stayed dead.
Mark Mizruchi, a professor of business administration at the University of Michigan, lays out, in painstaking detail, the collapse of the managerial ideology in his 2013 book, The Fracturing of the American Corporate Elite. After acknowledging that America’s corporate leaders have never been among its most progressive or socially minded, even at their post–World War II best—they “presided over a system rife with poverty, racism, repressive social norms, and a smug, uncritical attitude in which they refused to acknowledge the problems they had created or exacerbated”2—he makes a compelling case that there really was a time when America’s managers, which obviously includes its legion of MBAs, could be said to have acted not just in the best interests of their companies but of society itself.
Mizruchi pinpoints the moment that corporate executives abdicated a true leadership role in American society as sometime in the early 1970s. After listing a number of powerful societal forces that brought them to that moment—the rise of “neoliberalism” and its emphasis on free markets and rejection of any government intervention in the economy; the triumph of “shareholder value” over managerial prerogative; and the underlying changes in the American economy itself, away from manufacturing toward services, and from industrialism to financialism—he argues that such forces didn’t make it inevitable that corporate leaders had to stop fighting the good fight. American corporations, he points out, may have more political power in the early twenty-first century than they’ve had since the 1920s. What they don’t have—and what he suggests is the root cause of the above changes—is a willingness to “mount any systematic approach to addressing even the problems of their own community, let alone those of the larger society.”3 In other words, when the shit hit the fan in the 1970s, America’s corporate leaders abandoned any pretense to the role that Wallace Donham had laid out for them as the enlightened custodians of not just the world’s most powerful economy, but of the country itself.
He presents a compelling, if depressing, argument in support of his thesis. When the going was good, American management, which has never been described as liberal, nevertheless bought into a moderate approach to politics, one that included grudging acceptance of labor unions and government regulation and intervention in the economy. A so-called Keynesian consensus emerged in the 1960s, in which the theories of economist John Maynard Keynes held sway. While a discourse on competing economic ideologies is beyond the scope of this book, for the sake of argument, the consensus was that the private sector wasn’t the faultlessly efficient driverless vehicle other economists have argued it to be, and that sometimes active policy responses, whether fiscal or monetary, were necessary to keep the economy from driving itself right off the road during the inevitable (and jarring) turns in the business cycle. Keynes argued for a mixed economy, one that was still driven by the private sector but kept inside the yellow lines by occasional government intervention, particularly during recessions. “By 1971,” writes Mizruchi, “a majority of top corporate executives expressed support for both Keynesian deficit spending and the idea that the government should step in to provide full employment if the private economy was incapable of doing so.”4
When the going got tough, as it did in the early 1970s—when high government spending, the emergence of foreign competition, and the energy crisis created so-called stagflation, the unprecedented combination of slowing growth, high unemployment, and high inflation, a situation that was compounded by the legitimacy crisis in American institutions of the time—the ability of corporate executives to rise above their own narrow self-interest was put to the test. And they failed.
The “fracturing” of Mizruchi’s title was a ways off, however. Seeing themselves as embattled, “they mounted a counteroffensive, a full-scale mobilization in which corporations, large and small, found an increasingly unified voice.”5 They attacked government regulation, in particular the recently created Environmental Protection Agency and the Occupational Safety and Health Administration. They accused unions of hobbling American business’s ability to compete, and set out to destroy them, once and for all. And they succeeded. The election of Ronald Reagan in 1980 only solidified their victory, both in terms of unions (Reagan’s firing of the nation’s striking air traffic controllers in August 1981 is widely viewed as signaling the end of the American labor movement) and regulations (instead of the dangerous political move of trying to throw out regulations entirely, Reagan simply appointed critics of regulation to head the major agencies).
The Golden Passport Page 44