Trillion Dollar Economists: How Economists and Their Ideas Have Transformed Business

by Robert Litan


  The Business of Forecasting

  The business of economic forecasting as we know it today has its roots in the Keynesian revolution during the Great Depression and World War II. One of the pioneers of econometrics and macroeconomic forecasting was a Dutch economist, Jan Tinbergen, who is credited with developing the first comprehensive model of how an entire national economy works. Having done it first for his home country, the Netherlands, in 1936, Tinbergen produced a model of the American economy in 1938 for the League of Nations in Geneva, Switzerland.2 Tinbergen’s initial model is considered a precursor to the large forecasting models solved by computers today. In 1969, Tinbergen and Norwegian economist Ragnar Frisch shared the very first Nobel Prize awarded in economics, “for having developed and applied dynamic models for the analysis of economic processes.”3

  American economist Lawrence Klein created the first large comprehensive forecasting model of the U.S. economy (see box that follows). Having studied for his PhD under Paul Samuelson at MIT during World War II, Klein built a more robust version of Tinbergen’s earlier model in order to estimate the impact of the government’s policies on the U.S. economy.4

  Lawrence Klein

  Lawrence Klein is widely acknowledged to be the father of macroeconomic econometric models. Born in Omaha, Nebraska, in 1920, Klein grew up during the Great Depression and went to the University of California, Berkeley for college, where he studied economics and mathematics. He earned his PhD at MIT and thereafter joined the econometrics team at the Cowles Commission of the University of Chicago (now the Cowles Foundation), where he began the challenging task of, in his words, “reviving Jan Tinbergen’s early attempts at econometric model building for the United States.”6

  During the late 1940s and 1950s, Klein traveled around the world and performed pioneering research in econometrics and macroeconomic modeling. Shortly after World War II, Klein used his model of the U.S. economy to correctly predict—against conventional wisdom—an economic expansion rather than another depression.7 At the University of Michigan, Klein worked with a graduate student, Arthur Goldberger, to develop what later became known as the Klein-Goldberger Model, an early U.S. macroeconomic forecasting model. Klein left Michigan because the university denied him tenure due to his earlier, temporary post-war affiliation with the Communist Party, which he had joined simply to speak at a particular event (a decision he had long since renounced, to no avail).

  Klein then moved to Oxford for a short period before being invited in 1958 to join the faculty of the University of Pennsylvania. He promptly began work on a series of models that became known as the Wharton Models, which ultimately contained thousands of equations solved by computers to forecast a wide range of macroeconomic variables. The following year, he was awarded the John Bates Clark Medal for his pioneering work in macroeconomic modeling. The University of Pennsylvania was his academic home for the rest of his life.

  Beginning in the early 1960s, Klein began consulting on forecasting for both private and public sector clients around the world. In the late 1960s, Klein played a critical part in initiating and leading Project LINK, a large and ambitious research project aimed at coordinating econometric models in different countries.8

  In the late 1960s, Klein founded Wharton Econometric Forecasting Associates (WEFA) as a nonprofit organization within the University of Pennsylvania; it was later sold to a private publishing company. Over the years, WEFA became one of the world’s leading forecasting organizations, and Klein remained engaged in special projects even after the firm merged with its main competitor, Data Resources Inc. (DRI), to form Global Insight in 2001.9

  Klein was awarded the Nobel Prize in Economics in 1980 “for the creation of economic models and their application to the analysis of economic fluctuations and economic policies.”10 Klein lived to the age of 93 and died at his home in 2013.

  Finally, one personal note: I will be forever grateful to Professor Klein for recommending me for my first job at the Brookings Institution after completing my undergraduate studies at the University of Pennsylvania. At Brookings, I had the great privilege of serving as the research assistant for Arthur Okun, another of the great economists of the latter half of the twentieth century, who at 39 became the youngest chair of President Johnson’s Council of Economic Advisers (and who tragically died of a heart attack at the young age of 51). My career would never have been the same without Klein’s kind and extremely generous gesture to a (then) young college student.

  By the 1960s, Klein was the undisputed star in the field of forecasting. Through a nonprofit organization set up within the University of Pennsylvania known as Wharton Econometric Forecasting Associates (WEFA), he would regularly perform and sell forecasts to both the private sector and governments around the world.5

  During roughly this same period, Klein and other economists at the University of Pennsylvania collaborated with Franco Modigliani (another future Nobel Prize winner) and his colleagues at MIT, and with economists at the Federal Reserve Board to build the Penn–MIT–Fed macroeconomic model of the economy. That model has been successively refined through the years, but it is still the workhorse of the Fed staff in preparing their forecasts for the meetings of the Federal Open Market Committee, which sets monetary policy and conducts other business of the Fed.

  Klein and WEFA’s foray into econometric forecasting attracted other entrants. Among the more notable, and for a time the most successful, was Data Resources Inc., founded by the late Harvard economist Otto Eckstein and Donald Marron, a former CEO of Paine Webber (a brokerage firm bought by UBS bank in 2000). During the 1970s and 1980s, DRI and WEFA were the dominant macroeconomic forecasting firms, projecting the outlook not only for the entire economy but also for specific industries. Both firms also provided one-off studies of particular subjects using their econometric engines—their large bodies of equations, based on regression analysis and historical data on multiple variables.

  Through much of this period it seemed as if the macro models had unlimited futures. But then, as in other industries, disruptive technologies combined to make the macro forecasting business, as a commercial operation, much less profitable. One of these technologies was older, and had been in use for some time before it helped seal the fate of the large macro models: software for regression analysis and other statistical techniques that individual users could run on their own mainframe and, later, minicomputers. One of the most popular programs of this genre, TSP (Time Series Processor), was developed by Robert Hall while he was a graduate student in economics at MIT. Hall is one of the nation’s leading economists; he has long taught at Stanford and, at this writing, heads the committee of the National Bureau of Economic Research that pinpoints the dates at which expansions end (and recessions begin) and later begin again.

  Hall’s original TSP and the subsequent versions refined by Berkeley’s Bronwyn Hall were important, but it was a hardware innovation—the personal computer—combined with the statistical software packages then available that really disrupted the macro modelers. Armed with a PC, a statistics package, and some data, virtually anyone with enough training could build his or her own, much smaller models without paying substantial annual sums to the macro modelers for either macro or industry-specific (micro) forecasts. And that is precisely what many customers of the macro modelers eventually did.
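
  To make concrete what that shift meant, here is a minimal sketch of the kind of small model a desktop user could build, written in Python with the statsmodels package as a modern stand-in for tools like TSP. The data are fabricated purely for illustration; any real application would use actual published series.

```python
# A minimal sketch of a small desktop macro model: an ordinary least
# squares regression of GDP growth on its own lag and a change in
# interest rates. All data below are fabricated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60  # e.g., 15 years of quarterly observations
rate_change = rng.normal(0.0, 0.5, n)
gdp_growth = np.empty(n)
gdp_growth[0] = 2.5
for t in range(1, n):
    gdp_growth[t] = (1.0 + 0.5 * gdp_growth[t - 1]
                     - 0.8 * rate_change[t] + rng.normal(0.0, 0.3))

# Regress growth on a constant, its first lag, and the rate change.
X = sm.add_constant(np.column_stack([gdp_growth[:-1], rate_change[1:]]))
model = sm.OLS(gdp_growth[1:], X).fit()
print(model.params)  # estimated constant, lag coefficient, rate effect
```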

  Macro-model customers moved away from the models for other reasons as well. For one thing, the models were so large, with so many equations, that they were not transparent. Users couldn’t easily understand how a change in one or more of the input variables translated into changes in projected outputs. They simply had to trust the model, or the modeler, since it was also unclear how often, and to what extent, those running the models adjusted the raw projections with judgmental factors reflecting their own beliefs about whether the models’ unadjusted output could be trusted.

  Another contributing reason for the decline in macro modeling was the so-called Lucas critique, outlined by eventual Nobel Prize winner Robert Lucas of the University of Chicago. Lucas demonstrated that fiscal and monetary policies were themselves influenced by some of the factors driving the forecasts of macro models, and that people change their expectations and behavior when policies change, so the historical relationships embedded in the models’ equations could not be counted on to hold under new policies. One therefore could not draw reliable conclusions about the impacts of proposed policy changes by using the models. In technical terms, Lucas showed that fiscal and monetary policies were not truly independent variables.

  Another problem that has plagued not only the macro models but also users of regression analysis is how to distinguish between causation and correlation. Two or more variables may be highly correlated with the variable to be projected, say GDP, but it may not be clear they cause or determine GDP. Although Clive Granger developed a statistical method for addressing this problem—an achievement that earned him a Nobel—the macro models did not correct all of their equations for it.
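
  For readers curious what such a test looks like in practice, below is a hedged sketch of a Granger causality test using Python’s statsmodels package (a standard modern implementation, not Granger’s original code). The two fabricated series are constructed so that one genuinely helps predict the other; even so, a rejection says only that one series helps predict the other, not that it causes it in any deeper sense.

```python
# A sketch of a Granger causality test: does series x help predict
# series y beyond what y's own past explains? Fabricated data only.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    # y depends on its own past and on x two periods back,
    # so x should "Granger-cause" y.
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 2] + rng.normal(scale=0.5)

# Column order matters: the test asks whether the second column helps
# predict the first. Passing the test establishes predictive content,
# not causation in any deeper sense.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=3)
```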

  Yet another challenge to the macro models was posed by the rise of VAR models (technically vector autoregression models) that were statistically fancy ways of just extrapolating past data into the future. VAR models often outperformed the structural macro models. One of the leading exponents of VAR models is another Nobel Prize winner, Christopher Sims of Princeton University. Both VAR and the macro models had difficulty predicting turning points in the economy, or the beginnings of recessions or expansions.
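
  As an illustration of how little machinery a VAR requires, here is a sketch using Python’s statsmodels, again with fabricated data. Each variable is regressed on lagged values of both, and the fitted equations are then simply extrapolated forward, which is all a VAR forecast does.

```python
# A sketch of a two-variable vector autoregression (VAR): each series
# is regressed on past values of both, then extrapolated forward.
# The data are fabricated, purely for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 120
growth = np.zeros(n)
inflation = np.zeros(n)
for t in range(1, n):
    growth[t] = (0.5 * growth[t - 1] - 0.2 * inflation[t - 1]
                 + rng.normal(scale=0.4))
    inflation[t] = (0.1 * growth[t - 1] + 0.7 * inflation[t - 1]
                    + rng.normal(scale=0.3))

df = pd.DataFrame({"growth": growth, "inflation": inflation})
results = VAR(df).fit(maxlags=2)
# Extrapolate the recent past into the future, four periods ahead.
forecast = results.forecast(df.values[-results.k_ar:], steps=4)
print(forecast)
```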

  The decline of the large-scale macro-model business has not ended forecasting, however. Numerous forecasters, on their own or working mostly for financial companies, offer forecasts built with PCs and off-the-shelf software and are routinely surveyed by popular news outlets such as the Wall Street Journal. At the same time, several large-scale commercial models remain (Moody’s, Macroeconomic Advisers, and IHS, which also bought Global Insight). The Fed and the International Monetary Fund, among other official entities, continue to use their own large-scale macro models.

  Many businesses and other governmental organizations—notably the Congressional Budget Office and the Council of Economic Advisers—use an average of the major forecasts of key macroeconomic variables, such as GDP growth, inflation, and unemployment, compiled by Blue Chip Economic Indicators. This adapts the wisdom of crowds (really, the wisdom of experts), an idea widely popularized by the journalist James Surowiecki of the New Yorker, to forecasting.11
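
  The mechanics behind such a consensus are as simple as they sound: average the individual forecasts. A tiny sketch in Python, with invented numbers rather than actual Blue Chip survey data:

```python
# The "wisdom of experts" consensus is just an average of individual
# forecasts. The numbers below are invented for illustration and are
# not actual Blue Chip survey data.
forecasts_gdp_growth = {
    "Forecaster A": 2.1,
    "Forecaster B": 2.6,
    "Forecaster C": 1.9,
    "Forecaster D": 2.4,
}
consensus = sum(forecasts_gdp_growth.values()) / len(forecasts_gdp_growth)
print(f"Consensus GDP growth forecast: {consensus:.2f}%")  # 2.25%
```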

  The Business of Economic Consulting

  In the 1970s, new firms in what is now known as the economic consulting industry began applying the tools of economics, and particularly econometrics, to solve real-world business and legal challenges.

  Whereas the business of forecasting involves primarily macroeconomic models dealing with the national economy, the business of economic consulting involves the application of microeconomic tools to the challenges that individuals and firms face, rather than whole economies. In particular, the economic consulting firms formalized the business of providing economic expertise in an expanding array of legal disputes, addressing such questions as causation, valuation, and damages. Today, economists from various consulting firms are routinely used as experts, on both sides, in legal disputes involving antitrust, patent, discrimination, and torts (personal injuries) issues, among others. In addition, economists are frequently found in various regulatory proceedings at all levels of government.

  The rise of economic consulting also coincided with the development and growth of the field of law and economics, taught in both law schools and economics departments. One of the fathers of law and economics, Richard Posner, has had a tremendous influence on the way many judges analyze and decide cases. Posner, a law professor with economic training who later was appointed to be a federal circuit judge, is widely regarded as the most prolific legal scholar and judge of his generation. In 1977, he cofounded, with his University of Chicago Law School colleague William Landes, Lexecon, which became one of the more successful economic consulting firms. Lexecon is now part of the global firm FTI Consulting.12 Other successful competitors in the economic consulting business include Analysis Group; The Brattle Group; Cornerstone; CRA International; Economists, Inc.; Navigant; and National Economic Research Associates, or NERA. (Full disclosure: during the course of my career I have had a part-time relationship, as many economists do, with several of these firms.)

  The growth of the economic consulting industry as we know it today would not have been possible without the technological revolution of the past thirty years. In particular, many of the innovative tools and methods used in economic consulting, such as regression analysis, depend on the use of advanced computers, network services, and software to store and analyze large quantities of data.

  The contribution of the economic consulting industry to the economy should be put into some perspective, however. Since the litigation-consulting component of the industry is tied to specific disputes, the economists who participate in these matters largely assist in transferring wealth from one pocket to another. Those transfers may or may not be outweighed by gains in the economy’s productive efficiency, which arise to the extent that the quantification of damages helps the legal system deter undesirable behavior and thereby encourages resources to move to more productive activities. Whatever the net impact of litigation consulting may be, it is incontestable that economic consultants could not do their jobs, and the audiences they address—judges, regulators, and sometimes legislators—could not interpret the consultants’ work, without relying on the methods of analysis developed by academic economists and statisticians.

  Franklin Fisher

  While Lawrence Klein and Otto Eckstein were showing the real-world application of large-scale regression analyses, another econometrician, Franklin Fisher, now emeritus professor of economics at MIT, was publishing papers on econometric theory. Fisher’s research and econometric methods have been widely used by empirical economists for decades, both in academia and in the economic consulting business.13

  Fisher did his undergraduate work at Harvard, where initially he did not know where and how to apply his prodigious mathematical skills. He roamed around the Harvard course offerings until his section leader in an introductory history course suggested that Fisher try economics. He followed the advice and was smitten.

  One of his early undergraduate essays in economics was brought to the attention of Merton (Joe) Peck, a leading expert in industrial organization at Harvard (who, coincidentally, a number of years later taught me at Yale and was one of my PhD thesis advisers). Peck forwarded Fisher’s work to Carl Kaysen, another leading industrial organization expert also at Harvard. Kaysen also had an unusual experience for an economist: in the 1950s he spent a year as an economic clerk for Judge Wyzanski in the famous antitrust case brought by the Justice Department against the United Shoe Machinery Corporation.

  In any event, Kaysen was so taken with Fisher’s paper that he took the unusual step of becoming Fisher’s tutor, while also helping him gain immediate admission to graduate-level courses. Somewhat like my mother, who questioned the value of an advanced economics degree, Fisher’s mother came to see Kaysen around this time to express her skepticism about the path her son was clearly headed down, but apparently she was assuaged.

  Fisher became fascinated with econometrics in particular while working on an empirical research project with one of his professors. He then went to graduate school at MIT, where, after earning his PhD, he taught for his entire career. Fisher published some of the leading theoretical papers in econometrics in the 1960s, but later turned to more practical, empirical uses of econometric tools to understand particular problems once he realized that theoretical econometrics was moving in the direction of pure mathematics. His interest in econometrics, like that of many other economists, was ultimately practical.

  One demonstration of this bent was that Fisher became one of the early outside directors of the economic consulting firm Charles River Associates (CRA), a firm formed by John Meyer and several colleagues. Fisher has remained affiliated with CRA (and MIT) ever since.

  Data Analytics and Big Data

  Earlier I referred to the practice of data mining in the pejorative sense in which it was used during much of my career. My, things have changed. With the rise of the Internet, mobile telephones, and the proliferation of various kinds of databases, both public and private, the term now has both negative and positive connotations, each very different from those once attached to economists just running regressions. The negative associations overwhelmingly reflect concerns about intrusions on personal privacy by the government or private companies. The positive aspects of data mining, now associated with Big Data, relate to the ability of analysts to uncover patterns in very large data sets in a short period of time, patterns that can lead to new drugs and other products, new services, and new ways of producing or delivering them.

  It is not my purpose here to debate the pros and cons of mining Big Data and how to limit its downsides, but rather simply to point out that the analytical techniques used for extracting useful information from large data sets include (but are not limited to) regression analysis in its various forms. These techniques are used by businesses to analyze customer behavior in the real world and on the Internet; by pharmaceutical companies looking for new cures; by meteorologists looking to improve their forecasts; by financial institutions seeking to improve their detection of fraud; and, as will be discussed in the next chapter, by firms conducting continuous experiments (often on the Internet) seeking to refine their product and service offerings to consumers. Expect more uses and benefits from Big Data as more firms, and even the government, devote more resources to data analytics.
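
  To make one of these applications concrete, here is a hedged sketch of regression-based fraud detection in Python using scikit-learn. The transactions are fabricated and the features deliberately simple; a real system would use far richer data. But the logic, a logistic regression that scores each transaction’s probability of being fraudulent, is representative of the regression variants mentioned above.

```python
# A sketch of regression-based fraud detection: a logistic regression
# scores transactions by their probability of being fraudulent.
# All data are fabricated; real systems use far richer features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 5000
amount = rng.exponential(100.0, n)   # transaction size in dollars
foreign = rng.integers(0, 2, n)      # 1 if cross-border
night = rng.integers(0, 2, n)        # 1 if made overnight
# In this toy world, fraud is rare and likelier for large,
# foreign, overnight transactions.
logit = -6.0 + 0.01 * amount + 1.5 * foreign + 1.0 * night
fraud = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([amount, foreign, night])
model = LogisticRegression(max_iter=1000).fit(X, fraud)
# Score a new $400 cross-border overnight transaction.
print(model.predict_proba([[400.0, 1, 1]])[0, 1])
```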

  I am aware that the main practitioners of data mining are statisticians rather than economists. Indeed, Google, with the huge amount of data generated by the vast number of searches conducted on its website each day, has many more statisticians than economists on its staff analyzing data.14 Nonetheless, economists can be useful in structuring these analyses and highlighting what to look for, as well as in designing and interpreting the results of the experiments aimed at improving customer experiences. Businesses increasingly recognize this, and they are hiring economists in growing numbers (after pretty much ignoring them for the preceding two decades).15

 
