The Breaking Point
Chapter Five
Squandering the Spoils of a “Good War”
In the past 11 years in particular we have seen lies, fraud, bogus statistics and Mickey Mouse bookkeeping. For good measure the powers behind government have thrown in the gutting of America’s industrial base by outsourcing and offshoring. As an extra temporary measure the Fed has bailed out the financial sectors in the US and Europe and continues to bail out the US Treasury . . . America is slowly discovering that the land of the free and home of the brave has become a corporatist fascist nightmare.
—Bob Chapman, “Economy Debased by Lies, Fraud, and Bogus Statistics,” The International Forecaster, August 6, 2011
The $800,000,000,000.00 Lie
In 1950, the US government tried to faithfully report increases in the cost of living. But that changed decades ago. Politicians of both parties realized that you and other citizens don’t have the time to verify the regular Consumer Price Index (CPI) reports. So they concluded that they could get away with grossly understating inflation, and with it the downward trend in real household income. They have done such a thorough job of disguising the falling purchasing power of the dollar that current outlays for Social Security, allegedly adjusted for inflation, are just half what they would be if consumer inflation were honestly measured.
Yes, politicians lie.
As part of their ongoing budget negotiations, both Obama and congressional Republicans have embraced proposals incorporating even bigger lies in calculating inflation adjustments for Social Security, with the explicit objective of trimming billions, and ultimately trillions, more from that program’s future costs.
Lies. Lies. Lies.
You may not realize that inflation in 2013 was running at an annual rate of 9.6 percent if measured as it was during the Carter administration. But when you look back over six decades you can easily see that the dollar has lost far more of its value than the official statistics suggest. You know that for a family to live a middle-class lifestyle today it would need three or four times $30,591.30—the supposedly inflation-adjusted average income from 1950.
In fact, among the crazy quilt of current federal poverty income levels, the 250 percent ceiling to qualify for “Silver Plan” subsidies counts an income of $60,625—198 percent of the supposedly inflation-adjusted median income from 1950—as impoverished for a family of four. A joke.1
Perhaps one reason politicians still reported economic statistics honestly in 1950 is that the news then was good. The first three quarters of 1950 all rank among the top five quarters for annualized GDP expansion in the whole history of the United States. Q1 growth for 1950 was the highest ever recorded in the United States at 17.2 percent (annualized)—growth the Chinese would envy today. It slowed to a mere 12.7 percent in Q2 of 1950 before surging again to 16.6 percent in Q3. The United States was at the top of the world then.
Not incidentally, since fathers were economically relevant in 1950, they were present in the lives of their children. When I was a toddler in 1950 Washington, DC, 90 percent of American children lived in families with both parents present. It is a measure of the nation’s decline that in some sections of DC today, those proportions are reversed, as detailed by the Washington Times in Luke Rosiak’s December 2012 article “Fathers Disappear from Households across America.”2 Rosiak reported that in Southeast Washington, one in ten children lives with both parents, while 84 percent live with only their mothers.
The comparison with 1950 is even starker when you consider that widows led almost a third of the female-headed households in 1950. Notwithstanding the almost incessant wars fought by the United States in recent decades, widows headed only 14 percent of female-headed households in 2011.
America at the Summit of the World
America in 1950 was an industrial colossus. As Winston Churchill said of postwar America, “America at this moment stands at the summit of the world.”3
According to the International Organization of Motor Vehicle Manufacturers, the United States in 1950 produced 80 percent of the world’s automobiles, compared to about 6 percent today. The United States commanded a similarly outsized share in other spheres of economic production, with three-quarters or more of world output of machine tools, electronics, chemicals, airplanes, and computers. The United States was energy self-sufficient then. US oilfields in 1950 accounted for over 50 percent of world oil production. American industry produced more than twice the goods and services of all European industry combined. US per capita production was 60 percent above Germany’s, 70 percent above France’s, and 80 percent above the United Kingdom’s.
US GDP per capita was 4.52 times the average world GDP per capita in 1950. Total national output was more than three times that of the main rival of the United States—the late, not so great, Soviet Union.
How did we reach such an incredible “Great Prosperity,” as F. A. Hayek dubbed the “unique 25-year period” of the postwar boom? One of the fond conceits of Keynesian partisans is the fallacy that massive deficit spending during World War II—in which the United States spent money from an empty pocket—stimulated recovery from the lingering depression of the 1930s.
Not exactly.
Spoils of a “Good War”
World War II was a crucial factor in the mid-century prosperity of the United States, but outsized deficits didn’t lay its foundation. It was the full-fledged destruction of capital in the rest of the advanced world that made US industry so profitable in 1950. In the terms defined by Peter Taylor in The Way the Modern World Works, World War II was a “good war” for the United States.
Europe and Japan had been bombed to smithereens.
Only the United States and Canada escaped from World War II unscathed. Europe and Japan ended the war with their industrial capacity badly damaged and even ruined. The extensive destruction of capital restored the profitability of American business for a reason spelled out by Adam Smith in book 1, chapter 9 of The Wealth of Nations. As Smith explained, when many merchants compete in selling the same good, “their mutual competition naturally tends to lower its profit.” The reciprocal of that is that when many competitors go out of business, the rate of profit for the remainder tends to go up.
Keynes’s Recipe for Combating Deflation without Adjustment
Keynes saw the crisis of the 1930s as one of underconsumption.
His theory argued for increasing aggregate effective demand through budget deficits and sought to explain depression on the basis of slack demand, with the stipulation that wages were “sticky.” But he offered no explanation of why wages happened to “stick” at one level rather than another.
Politically, of course, Keynes’s argument comfortably fit the requirements of a stagnant, regulated economy. This harks back to the regulation of capitalist commerce in the thirteenth and fourteenth centuries, as characterized by Pirenne in The Stages in the Social History of Capitalism. He wrote of the “most characteristic provisions” of statutes that “fix wages and regulate the conditions of work,” noting that they were “inspired by the desire to prevent operations that will unfavorably affect prices and the workman’s wages.”4
The great political appeal of Keynes’s formula is obvious. He prescribed combating overproduction and underconsumption with the least possible inconvenience to either producers or workers. His recipe combatted deflation without adjustment.
That said, essentially every economist agrees that more competition reduces profits. And of course, when profits fall, this puts downward pressure on wages and employment.
“The Misdirection of Production”
The politically incorrect Austrian business cycle theory (ABCT) suggests that the crisis of profitability is accelerated and accentuated by central bank manipulation of interest rates. Unlike most theories of the effects of credit expansion on prices and output, the ABCT is not primarily concerned with the effects of the total money supply on the price level or on aggregate output and investment.
The Austrian economists, such as Ludwig von Mises and Friedrich von Hayek, identified business cycle bubbles as consequences of malinvestments, stimulated when credit expansion by central banks distorts the intertemporal coordination of resources between capital and consumer goods (Hayek called it the “misdirection of investment”). This lengthens the structure of production beyond what the underlying pool of accumulated resources can support, given the time preferences of the various participants in the economy.
In the Austrian view, there is too little savings to support the lengthened structure of production. Time preferences are too short. People generally want to consume rather than save. As Hayek put it, this “can only lead to a much more severe crisis as soon as the credit expansion comes to an end.”
Toward the end of his long life, Hayek fretted that the “Great Prosperity” had been exaggerated by the elimination of restraints such as the gold standard and fixed rates of exchange. He believed an expansion of credit and open inflation had created full, and even excessive, employment. Hayek regretted political efforts to prevent the coming of a depression in the 1970s, fearing that they would lead to an even worse eventual breakdown.
Bear this in mind as we go forward. It is crucial to understanding how politicians destroyed US prosperity and why their continued attempts to inflate bubbles only ensure more stagnation and ultimate collapse. There will be no new sustainable boom emerging until there is enough creative destruction of capital to restore profitability in a new surge of innovation.
More on “The Great Prosperity”
In 1950, any such reckoning seemed far away. All the European allies owed vast amounts to the United States, which had emerged from the war with two-thirds of the world’s gold reserves. The US dollar had become the world’s reserve currency, and until the late ’50s, it was the only fully convertible major currency not subject to exchange controls. In those days, the United States provided 85 percent of the world’s direct foreign investment. And American management and marketing techniques became dominant practice in the other advanced economies.
So vast was the US lead in production, management, and marketing that competitors fretted that Americans would put everyone else out of business. This view was exemplified by French analyst J. J. Servan-Schreiber’s 1968 international best seller, The American Challenge. He argued that the growing gap between American industry and the rest of the world posed problems that could lead to catastrophe.5
Of course, it turned out that Servan-Schreiber was quite wrong in imagining that the United States had mastered the challenge of universal education for continued prosperity. Nor did it prove true that the United States’ lead in productivity growth would continue to compound over the remaining years of the twentieth century, shrinking the workweek through automation. Servan-Schreiber expected that America would become a postindustrial society by the late ’90s, with four seven-hour workdays per week, 39 workweeks per year, and 13 weeks of vacation. With weekends and holidays, that would have left only 147 workdays a year.6
Sounds like fun. But it didn’t work out that way.
Far from enjoying a life of leisure, Americans now work longer and take multiple jobs to make ends meet. As reported in the Financial Times in March 2015, the average American works 85,000 hours in a lifetime, 70 percent more—35,000 additional hours—than the average Finn, and 15,000 hours more than the average German.
In other words, Americans put in the equivalent of 32 extra years of the postindustrial 28-hour workweeks Servan-Schreiber imagined. Outside of Asia, Americans work longer than anyone else. Even the great majority of farm operators in America now also hold outside jobs.
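The arithmetic behind these comparisons is easy to verify. A minimal sketch, using only the figures quoted above (the roughly 50,000-hour Finnish lifetime total is implied by the text, not stated in it):

```python
# Figures quoted in the text.
us_lifetime_hours = 85_000
additional_vs_finn = 35_000  # "70 percent more" implies a Finnish total of 50,000 hours
assert round(0.70 * (us_lifetime_hours - additional_vs_finn)) == additional_vs_finn

# Servan-Schreiber's postindustrial schedule: four 7-hour days a week,
# 39 workweeks a year -- i.e., a 28-hour workweek.
hours_per_year = 4 * 7 * 39            # 1,092 hours a year
extra_years = additional_vs_finn / hours_per_year
print(round(extra_years))              # 32 -- the "32 extra years" in the text
```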
As reported by C. E. Clark in a September 2015 HubPages article—“Working 2 or More Jobs—Is This the New Normal?”—increased prices for goods and services and decreased salaries and wages mean that even high-income earners are increasingly obliged to work multiple jobs in twenty-first-century America just to maintain their lifestyle.7
The United States was far and away the world’s greatest creditor in 1950. Today, we have become the most indebted country in the history of the world. In 1950, the US national debt was about $257 billion. In the space of sixty-five years, the US national debt multiplied an astonishing seventyfold in nominal terms to rival the Death Star at $18.15 trillion.
In 1950, the average American earned $3,210 a year, when a dollar was worth much more than it is now. According to clearly corrupt government inflation data, a 1950 dollar is worth $9.53 today. The average cost of a new house then was $8,450 ($80,528 today according to the pretend government inflation statistics). According to the Census Bureau, the average sale price of a new home in the United States in June 2013 was $249,700.8 The average family spent just 22 percent of its income on housing in 1950—50 percent less than today. The average cost of a new car in 1950 was $1,510 ($14,390 today if the official inflation adjustment were accurate). By contrast, the average transaction price for a new car in May 2013 was $30,978.
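The multiples implied by these figures can be checked in a few lines. A quick sketch, taking the $9.53 official CPI multiplier and the 1950 and 2013 prices quoted above as given:

```python
CPI_MULTIPLE = 9.53                    # official 1950-to-today dollar multiplier cited above

house_1950, house_2013 = 8_450, 249_700
car_1950, car_2013 = 1_510, 30_978

# The official inflation adjustment reproduces the text's figures...
print(f"${house_1950 * CPI_MULTIPLE:,.0f}")    # $80,528
print(f"${car_1950 * CPI_MULTIPLE:,.0f}")      # $14,390

# ...but actual prices rose far more than the official adjustment suggests.
print(f"{house_2013 / house_1950:.1f}x")       # 29.6x, versus 9.53x official
print(f"{car_2013 / car_1950:.1f}x")           # 20.5x, versus 9.53x official

# National debt: $257 billion (1950) to $18.15 trillion (2015).
print(f"{18.15e12 / 257e9:.1f}x")              # 70.6x -- the "seventyfold" in the text
```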
Another major difference between 1950 and today is that Americans used to live within their means. We were paid twice as much as Europeans with similar credentials, so it wasn’t a great hardship to do without credit cards. The first commercial credit card, the invention of Diners Club founder Frank McNamara, was issued in 1950. American Express and Visa did not come along until eight years later, and MasterCard began business in 1966. The original cardboard Diners Club card was honored in twenty-seven restaurants in New York City. By 1951, there were 20,000 Diners Club cards in use, or about one for every 7,500 Americans.
While the population of the United States has little more than doubled since 1950, the number of credit cards in circulation has gone up by about 30,500 times. US cardholders now have 609.8 million credit cards outstanding. Based on May 2013 statistics from the Federal Reserve, total US credit card debt was $856.5 billion, with the average household owing $15,325.
According to the Bureau of Labor Statistics, the average American, with or without a Diners Club card, consumed 3,260 calories a day in 1950, about 600 calories more than the average man eats today. But not one of those 1950 vintage calories came from high-fructose corn sweetener, an artificial food that was not invented until 1957 and is now the largest source of calories in the American diet.
Equally, while consumption of artificial trans fats was growing with the spreading popularity of margarine, most families in 1950 still used butter. Children in the 1950s vintage peanut gallery drank three times more milk than soda—the opposite of today, when children drink three cups of soda for every cup of milk. And the milk they drank in 1950 was natural whole-fat milk—not the sugary no-fat concoction of lactose (milk sugars) that so many children imbibe today, one reason almost 19 percent of American kids, ages six to eleven, are obese.
Yes, there were fat people in the 1950s. (Some of the maids employed by my mother and my aunt were real butterballs.) Yet as a child in that era myself, I can’t remember ever seeing a fat kid—until I reached high school in the 1960s.
In that long-lost and innocent world, when no one worried about the Kardashians and their stretch marks, children may not have been precocious enough to calculate economic indicators, but they were economic indicators in themselves. The Baby Boom was a sign of good times.