The United States had followed Britain into the Middle Eastern oil fields in 1924, with the American share of oil pumped from the desert growing steadily from one sixth of the total in 1936 to more than half by the end of World War II. Most of the crude came from Saudi Arabia, but there was oil everywhere, leading the head of the U.S. Petroleum Commission to refer to the Middle East as the “center of gravity of world oil production.”8 America’s dependence on foreign oil (or, better phrased, Detroit’s dependence) became more acute during the next round of Arab-Jewish conflict, the 1956 Suez Crisis. As with the two wars that followed it, this struggle was a continuation of the unresolved hostilities lingering from the 1947–48 Arab-Israeli war.
In May 1967, Egypt and her Arab allies were ready to try again to evict Israel. After ejecting the UN peacekeepers, Egyptian President Gamal Abdel Nasser stated on Cairo Radio that the offensive was to be a “mortal blow of annihilation,” and that it was his intention to “wipe Israel off the map.”9 Again, Israel did not wait to be overwhelmed by the forces concentrating on her borders but, as in 1956, struck first, launching air strikes that decimated Arab air power and exposed the massed tank columns to merciless bombardment from the skies. The war lasted only six days, allowing Israel to capture the Golan Heights near Galilee, the Sinai, and the West Bank, all parts of the biblical boundaries promised the Jews. Yet even after smashing Arab military forces and unifying Jerusalem under Jewish control for the first time since A.D. 70, the Israeli victory did not produce a lasting peace.
Nasser’s successor, Anwar Sadat, quickly rebuilt the Egyptian military and organized yet another invasion in 1973. Striking on Yom Kippur, the holiest day of the Jewish religious calendar, Sadat’s well-coordinated attack this time attained success. Supplied by the Soviets, the Arab armies offset the Israeli advantages in the skies, knocking out 20 percent of Israel’s planes in four days. Tel Aviv was in panic. Israeli prime minister Golda Meir (who had been raised in America) persuaded Nixon to provide an emergency airlift of arms, which gave Israel the logistical means to counterattack after days of absorbing blows. America’s airlift was reminiscent of the Berlin effort, covering 6,400 miles per day and delivering thousands of tons of supplies. In his own darkest hour, Nixon virtually single-handedly saved Israel.
Within a short period the Israeli counterattack routed the Arab armies. The Soviets, responding to Sadat’s request for aid, threatened to enter the conflict. By then, Watergate had so engulfed and distracted Nixon, who was spending virtually all his time combating investigators and editing tapes, that he could not conduct international negotiations himself. He deployed Kissinger to respond to the Soviet threat, and in an unprecedented occurrence it was Kissinger, not Nixon, who issued the orders for American military units to go on full-scale, worldwide alert, the first such alert since JFK had ordered one in 1962.
The combatants brokered another truce, but this time the United States paid a heavy price for its involvement. In October 1973 the Organization of Petroleum Exporting Countries (OPEC), which represented virtually all of the Muslim oil-producing nations, cut oil production and boosted prices 70 percent. Two months later OPEC jacked up prices another 128 percent. Gasoline and home heating oil prices in the United States soared, to which Nixon, the born-again Keynesian, responded by imposing a price ceiling on oil. This made gasoline artificially cheap. Americans knew what gas was really worth, but the government had just encouraged them to act as though it cost a fraction of what it really did. Demand shot up; supply remained low; prices stayed fixed. Available stocks of gasoline disappeared as fast as people could pump the cheap gas into their cars. An incredible spectacle ensued of car lines at gas stations, sometimes stretching completely around city blocks, to purchase the scarce, yet still cheap, fuel. State governments instituted an array of gimmicks to limit consumption, including even-odd license-plate-number purchase systems. None of it worked. The wealthiest nation in the world slowed to a crawl because the government had artificially lowered fuel prices.
The high cost of energy, in turn, sent price shocks throughout industry and commerce. Sugar, which cost 13 cents per pound in 1970, had risen to 33 cents by 1974. Flour, at 11 cents per pound in 1970, had nearly doubled to 20 cents four years later. Steaks that had cost $1.54 per pound rose to $2.12 per pound by 1974. Stereos priced at $119 in 1970 rose to more than $154 in 1974, and a man’s shirt rose more than 30 percent in price.10 All consumers, not just drivers, were hurt by the government’s flawed policies.
That did not stop Congress from getting into the act and setting the speed limit on federal highways at 55 miles per hour, a pace that was unrealistic for the vastness of places like Texas or Nevada, and one that often caused auto engines to operate at less than peak efficiency (thus burning more gas!). At the same time, Congress also established fuel-efficiency standards for all new cars, known as the corporate average fuel economy (CAFE) regulations, which conflicted directly with the environmental pollution restrictions that lawmakers had earlier placed on automakers. Auto manufacturers responded by adopting lighter materials, such as aluminum and fiberglass, which proved less safe in the event of accidents. Gas mileage went up, but so did highway fatalities, even at slower speeds.
The blizzard of rules and regulations only displayed the utter incapability of the government to manage market-related issues. For example, to make cars safer, in 1966 the government implemented the National Traffic and Motor Vehicle Safety Act (partly in response to Ralph Nader’s Unsafe at Any Speed), requiring automakers to install seat belts, impact-absorbing steering columns, and padded dashboards as standard equipment.11 These additions all drove up weight, which reduced gas mileage! Rather than trust consumers at the showroom to determine which “values” or characteristics were most important to them (good fuel economy, safety, or environment-friendly features), the government had created battling bureaus, each with its own budget justifications, that lobbied against one another for funding.
In 1970 the Environmental Protection Agency decreed that auto companies had to reduce emissions to 10 percent of existing levels by 1976. The language of the law, however, affected only autos made after 1970, by which time emissions in new cars had already been reduced by 70 to 80 percent. Thus, the new law required those autos to cut their remaining emissions (the 20 to 30 percent left over) by a further 90 percent, down to as little as 2 to 3 percent of uncontrolled levels! Achieving the last few percent of performance of anything is nearly impossible, and the man who coined the term “smog” referred to the EPA air quality standards as “absurd.”12
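As a check on those figures, here is the arithmetic implied by the numbers above (a back-of-the-envelope sketch, not a statutory citation): if new cars already emitted only 20 to 30 percent of uncontrolled levels, a further 90 percent cut leaves

\[
0.20 \times (1 - 0.90) = 0.02
\qquad\text{and}\qquad
0.30 \times (1 - 0.90) = 0.03,
\]

that is, just 2 to 3 percent of the original, precontrol emissions.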
Detroit had no choice but to comply. GM introduced the catalytic converter, which worked only with unleaded gasoline (gasoline from which the tetraethyl lead had been removed, since lead fouled the catalyst). This, in turn, entailed a sizable drop-off in power, but made for longer-lasting engines. To compensate for the power drop-off, automakers looked to professional racing for ideas, souping up engines with overhead camshafts, turbochargers, fuel injection, and a host of other high-performance equipment that soon became standard on passenger cars. But the entire process of regulating, then powering up, was the equivalent of adding a small anchor to a boat, then installing extra engines to overcome the drag!
Fuel efficiency, of course, is desirable if the market determines that it is a more important use of scarce resources than, say, cost or safety. When the United States had seemingly limitless forests, colonists burned trees at record levels because they were cheap. Whale oil was replaced as a source of interior illumination only when it grew more expensive than John D. Rockefeller’s kerosene. Thus, when it came to the “oil crisis,” one fact is undeniable: without government interference, American gasoline prices in 1973 and 1974 would have risen dramatically, and that, in turn, would have forced down auto use and/or caused consumers themselves to demand more fuel-efficient cars. Based on the evidence of history, these changes would have occurred sooner without Nixon’s price controls.
Honey, I Shrank the Economy!
Even though prices were controlled at the pump, Americans felt the oil price hikes ripple through the entire economy via the industrial sector, which had no government protection. Higher energy costs drove up the cost of production, forcing layoffs and pushing the United States into a steep recession. Unemployment rose to 8.5 percent in 1975 (a postwar high), and the gross domestic product (GDP) fell in both 1974 and 1975. The oil-related recession cost the United States as much as 5 percent of its prehike gross national product (GNP), a level five times greater than the cumulative impact of the Navigation Acts that had “started” the American Revolution.
Great Society spending and layers of federal regulations made matters worse. Some states and cities suffered even more than the nation as a whole. New York’s liberal spending policies swung the city to the brink of bankruptcy, and the mayor asked for a federal bailout. New Yorkers had voted themselves into their problems, then looked to taxpayers from Colorado and Rhode Island to dig them out. President Ford, however, courageously promised to veto any federal bailout of the Big Apple. A local newspaper, with typical New York attitude, ran the headline, FORD TO CITY: DROP DEAD.13 In fact, Ford’s decision not to reward the city for financial malfeasance was exactly the appropriate constitutional response. Left to their own devices, New Yorkers worked their city back into the black.
In almost every area of American life, the federal government had already done quite enough. Whether through the EPA, OSHA, the Consumer Product Safety Commission, or myriad other new agencies, government at all levels had begun to heap voluminous, oppressive regulations on business. In 1975 alone, “177 proposed new rules appeared, as did 2,865 proposed amendments, for a total of 10,656 new and proposed rules and amendments, most of which applied to nearly all firms.”14 According to one study, environmental regulations enacted prior to 1990 by themselves (not including the 1970 Clean Air Act, the single largest antipollution law) reduced the GNP by 2.5 percent.15 Activists such as Ralph Nader and the environmentalists expected the “evil” corporations simply to absorb the costs, never expressing concern for the average people who had invested in such businesses to fund a new home or a college education for their children.
Companies, of course, did not just passively accept the new costly regulations. Instead, American business battled the government on three fronts: increasing spending for lobbyists in Congress, fighting the new rules in the judicial system and in the court of public opinion, and passing along the costs of the regulations to consumers. Not surprisingly, the pages of the Federal Register, which contained these rules, ballooned from 10,286 in 1950 to 61,000 in 1978, and at the same time the number of attorneys in the United States rose by 52 percent in a single ten-year period. More important, district court cases grew 131 percent, and civil cases in the U.S. appeals courts, where product liability suits were more likely to land, exploded by 398 percent.16 Predictably, corporations spent more on their legal divisions, while spending on research and development, the lifeblood of new products, consistently fell. There simply was not enough money to fund both lawyers and scientists.17 Every dollar spent to influence a lawmaker or run consumer-friendly ads was a dollar not spent on developing better and safer products or reducing the costs of existing goods. By 1980, America had four times as many attorneys per capita as Germany and twenty times as many as Japan, both of which had surged ahead of the United States in productivity growth, the key indicator of real economic growth.18
Meanwhile, big business was working against itself by avoiding change and innovation the way dogs resist baths. Significantly, not one of the top fifty technological changes in twentieth-century America came from established leaders in the field.19 IBM did not create the personal computer, nor did the calculating giant of an earlier era, the slide-rule maker Keuffel & Esser, create the punch-card computer. Airplanes sprang from the minds of bicycle mechanics, and word-processing programs from the scribblings of college dropouts. Cellular phones were not developed by AT&T or even the famous Bell Labs.20
Stability had served industry well when the United States overtook other, fading economic powers, and it easily perpetuated growth during the postwar decade, when there was little competition. But then complacency set in. Once the Japanese and Germans reentered world markets, U.S. companies lacked the competitive edge that had served them so well half a century earlier. Afraid of rapid change, corporations introduced only marginal improvements. For almost two decades, automakers thought that merely by tweaking a body style or introducing minor interior comforts they could compete with dramatic changes in actual auto design from Japan. Japanese carmakers had struggled for ten years to adapt their vehicles to American roads and to larger American physiques, so when oil prices suddenly placed greater value on smaller, front-wheel-drive, fuel-efficient cars, Honda, Nissan, and Toyota were more than ready. To their discredit, American auto executives continued to denigrate foreign innovations. It took a bankrupt Chrysler Corporation, under Lee Iacocca (the brains behind the Ford Mustang), to shock Detroit out of its doldrums. “We were wrong,” he courageously announced in one of his televised ads for Chrysler.21
New industrial evangelists like Iacocca, even had they been in the majority, constituted only half the equation for turning American business around. Labor, led by the hardscrabble union bosses who had achieved great gains at tremendous cost in the 1950s and 1960s, still acted as though it spoke for a majority of working Americans. By the 1970s, however, the unions were losing members at precipitous rates. Trade unions had formed a critical part of the Democratic Party’s New Deal coalition, and the most important organizations, the AFL-CIO and the Teamsters, were able to demand exceptionally high wages for their members in the automobile, steel, and trucking industries. By 1970, a typical line worker in Detroit commanded $22 an hour and owned two cars, a boat, and a vacation home on a lake, earnings equivalent to those of a midlevel attorney in 2002. Miners and truckers, as well as those in manufacturing jobs, had substantially higher incomes than many professionals and received better benefits than people in almost any income category.22 Unionized employees routinely made between $10,000 and $12,000 per year with overtime. New homes sold for about $23,000, meaning that a worker dedicating 30 percent of his income to a mortgage could own a house in six or seven years, a position comparable to that of a 1990s professional earning a $70,000 salary and supporting a $150,000 mortgage.23 Just as valuable as the cash, during the salad days of steadily increasing auto sales the auto and steel unions negotiated generous benefit and pension packages, adding to the real value of their contracts. Leaders such as George Meany of the AFL-CIO and Jimmy Hoffa of the Teamsters wielded tremendous influence, not only over the Democratic Party, but also over the nation’s economy.
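The home-affordability comparison can be checked with rough arithmetic (a simplified sketch that takes the midpoint $11,000 wage and ignores mortgage interest):

\[
\frac{\$23{,}000}{0.30 \times \$11{,}000/\text{yr}} \approx 7\ \text{years}
\qquad\text{versus}\qquad
\frac{\$150{,}000}{0.30 \times \$70{,}000/\text{yr}} \approx 7\ \text{years}.
\]

On those assumptions, the 1970 line worker and the 1990s professional carried roughly the same housing burden.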
“Big Steel” and the auto companies, of course, did not just absorb these expenses, but passed them on to consumers, which added to inflation. American manufactured products, especially textiles, steel, autos, and electronics, rose in price relative to foreign competition. In steel alone, the cost of labor was six times that of foreign competitors.24 Sometime in the early 1970s, prices exceeded the threshold that most consumers were willing to pay in order to remain loyal to American-made products, and buyers began to switch to foreign goods. Recapturing formerly loyal customers is twice as difficult as holding them. Japanese and European manufacturers, who were turning out lower-priced quality goods, gained millions of new American customers in the 1970s. For the first time, “Made in Japan” was not viewed as a sign of cheap, shoddy goods, but as a mark of quality. Foreign competitors increased their steel production by some 700 million net tons, and builders scrambled to replace expensive American steel with fiberglass, aluminum, plastics, ceramics, and concrete.
American steel companies took the biggest hit of all. The industry had seen its international market share fall 20 percent since the Korean War, when U.S. steelmakers claimed 60 percent of the world’s sales. Worse, only one new steel plant (a Bethlehem facility in Indiana) was constructed between 1950 and 1970. At the same time, Japan gave birth to fifty-three new integrated steel companies, most of them with brand-new, efficient mills, and Japanese assets in steel plants rose 23 percent between 1966 and 1972, compared to an investment in American plants of only 4 percent. The overall output of U.S.-made steel barely changed between 1948 and 1982, leading many steel executives to try to diversify their companies. Layoffs began with the expectation that they would be temporary. Then weeks stretched into months, and months into years. By 1980, it was clear that after years of unheeded warnings the wolf had finally come, and the industry would never return to its 1960s peak.25
This was the last gasp of organized union power in manufacturing America. From 35 percent of the American workforce in 1960, union membership entered a downward spiral, falling to 27 percent of the workforce by 1990. Even that did not tell the whole story, because the hard-core industrial unions had plunged more sharply still; the total was kept afloat only by the two largest unions in America, the National Education Association (NEA) and the American Federation of State, County, and Municipal Employees (AFSCME). By 1980, AFSCME had twice the membership of the United Steel Workers.26 Thus, it became abundantly clear why organized labor was committed to a permanently large and growing government and to public schools: those employees, operating outside the free market, now represented the unions’ only hope of long-term survival.