
The Glory and the Dream: A Narrative History of America, 1932-1972


by William Manchester


  To skeptics of the new prosperity the Astrodome was a shrine of tastelessness and overconsumption. Its 53 private boxes, available for five-year leases, rented for $15,000 to $34,000 a season. Private clubrooms, for those bored with the game below, were outfitted with bars and television and decorated in such themes as “Tahitian Holiday” or “The French Hunt.” Over fifty different uniforms were designed for Astrodome workers, depending on their tasks (ground crewmen wore space suits), and each employee was sent to school for three weeks to learn how to project the proper Astrodome image. The greatest spectacle in the stadium was a home run by the home team Astros. Fans never forgot it, and some visiting pitchers never recovered from it. The electric scoreboard went berserk. Rockets were launched and bombs exploded. Electric cattle wearing the American and Texas flags on their horns bucked wildly. Electric cowboys fired electronic bullets at them. An orchestra crashed through “The Eyes of Texas.”

  “When in doubt,” went an advertising slogan that year, “buy the proven product.” But the skeptics had no doubts. They recoiled from all mass merchandise. To them the marketplace was evil. The economic lessons they had learned in their youth had lost all relevance. “The essential mass problem had shifted,” Eric Goldman wrote. It was “less food, housing and clothing than how to live with a weirdly uneasy affluence.” Goldman described “a maldistribution no longer accepted by a significant section of the population.” Certainly that section was highly articulate. Its polemics against maldistribution could be found in any bookstore. A common theme in them was that mass culture led to the garrison-prison state.

  Some perceptive authors pointed out to intellectuals that they were not always logical. William H. Whyte Jr. remarked that it was “a retrograde point of view” which failed to recognize that a growth of Babbittry was “a consequence of making the benefits of our civilization available to more people.” Caroline Bird noted that “People who are usually compassionate sometimes fail to see how the full employment of the 1960s is responsible for what they experience as a general decline in morals, competence, courtesy, energy, and discipline.” John Kenneth Galbraith reminded liberal humanists that they could not pick and choose among technological advances. If they wanted some of them, they would have to put up with the rest. Nor could they accept the wonders of applied science and reject the special relationship between the state and industry which had made them possible: “It is open to every freeborn man to dislike this accommodation. But he must direct his attack to the cause. He must not ask that jet aircraft, nuclear power plants or even the modern automobile in its modern volume be produced by firms that are subject to unfixed prices and unmanaged demand. He must ask that they not be produced.” Galbraith remained an eloquent advocate of a higher quality of life, but he told fellow critics that before dismissing the technological society entirely they should reflect that it had brought them much that they enjoyed and took for granted; for example, “in the absence of automatic transmission of [telephone] calls, it would require approximately the entire female working force of the country to handle current traffic.”

  That was in 1967. Galbraith did not anticipate that within four years his assumption that all operators should be female would become fighting words. Yesterday’s truism had become today’s heresy. This, perhaps, was the overriding fact about the impact of science and technology on the United States. Its changes were convulsive, overwhelming. That was one reason for the jarring aspect of American life in the 1960s. Occupational skills became obsolete so rapidly that career planning was difficult and sometimes impossible for the young. In 1967, for example, the chemical industry calculated that half its business came from the sale of products which hadn’t even existed ten years earlier. One of these, the contraceptive pill, played a crucial role in the dramatic revision of American female expectations. At the same time, life expectancy for white females was approaching 75 years (as against about 68 years for white males). Science and technology were steadily altering the shape of the future. In 1968 Herman Kahn and the Hudson Institute issued a thousand-page study of what American life would be like in the year 2000. Their prediction was that by then the per capita income of Americans would be $7,500 a year. Seven-hour days and four-day weeks would be typical, the institute reported; so would thirteen-week annual vacations. With enjoyment replacing achievement as the goal of men and women, it appeared that the very reason for existence, and even existence itself, would be altered in ways which were now inconceivable.

  ***

  But anticipating the future was not necessary to grasp what the technological revolution had already done to the United States. A glance backward could be breathtaking. In the early 1930s, when the now gray and balding swing generation was just approaching its teens, the largest category of Americans untouched by progress was the farm bloc—over 30 million people. In those days they lacked knowledge of even the fundamentals of conservation, which was one of the reasons for the devastating dust storms. Without rural electrification, farmers read by lamplight. Without electric power the typical farm wife had to carry as much as 28 tons of water each year from the pump or spring. Her butter churn was operated by hand. She did her laundry in a zinc tub and preserved meat in a brine barrel. Her husband’s chores were even more backbreaking. After the morning milking he had two hours’ work with the horses before he could set about whatever he had planned for the day. Horses and mules were his major source of locomotion—there were over 20 million of them in the country—and when he went to town he drove over dirt roads. Later his life would be sentimentalized by those who had no idea what it had really been like. Some of the most arrant nonsense would be written about the farm kitchen, when in fact, as Clyde Brion Davis pointed out in The Age of Indiscretion, “most of the cooking was frying—and not even in deep fat. The traditional American farmer… was scrawny-necked, flat-chested and potbellied from flatulent indigestion.”

  If the farmer’s son was still living on the land a generation later, his world was entirely different. Trees planted by the CCC held the soil firm; strip-cropping and contour plowing made for greater yields and sturdier crops. Fifteen billion dollars in farm machinery had ended the tyranny of sweat and drudgery, and the 65 million acres once set aside for raising animal fodder were now used for produce. The development of hybrid corn had increased the nation’s corn harvest 20 percent without boosting the acreage needed. Driven to abandon cotton by the boll weevil threat of the 1930s, southern farmers had learned to plant other crops—and had tripled their income. The new farmer in the new rural prosperity drove to market on macadam. And his wife, in a kitchen glittering with appliances, the brine barrel having been replaced by a commodious freezer, fed her family properly. Afternoons she had time to run into town herself. She went to the hairdresser regularly and wore the same synthetic fabrics as her city sisters instead of the gingham dresses and cotton stockings of her mother.

  City toil had been transformed, too. The proletariat was disappearing. It was Norbert Wiener who had observed, in Cybernetics; or, Control and Communication in the Animal and the Machine (1948), that “there is no rate of pay which is low enough to compete with the work of a steam shovel as an excavator.” Already in the first half of the twentieth century automation had cut the number of common laborers from 11 million to 6 million. Over the next thirteen years the country’s work force grew by ten million—to 70.6 million—but the number of laborers continued to dwindle. Blue-collar workers were a shrinking minority. During the Eisenhower years the automobile industry’s production force dropped by 172,000 while it turned out a half-million more cars each year. The stature of the once mighty trade unions diminished; machines can’t strike. Labor leaders became conservative, suspicious of progress, and in some cases allies of their old foes, the corporations. Meanwhile, less need for male muscle was opening vast areas of employment to the women now entering the labor force, and the trend grew as the objectives of work changed. Instead of making goods, workers were joining the expanding service, amusement, and leisure industries. In the “new mass-consumption society,” George E. Mowry wrote, “the old equation of man confronting materials and making something new of them had been changed to man confronting man and persuading him to act.”

  One masculine stronghold did not change. That was the executive suite. The Hudson Institute held out no hope that business executives might look forward to working less and loafing more in the year 2000. They could not be spared; too much depended upon them. This was a switch from the Roosevelt years. Executive ascendancy had been predicted in James Burnham’s widely reviewed book of 1941, The Managerial Revolution, but Depression folklore had generally held bosses in contempt, and Depression novelists and dramatists had depicted them as knaves and fools. (Moviegoers may recall the character actor who portrayed this stock role most successfully. He was Edward Arnold.) Yet by the 1960s they were high in the saddle. To be sure, they had little in common with the piratical entrepreneurs of the past. “The Tycoon,” said Fortune, “is dead.” Time described the new businessmen as “the professional managers, the engineer-trained technicians” who “took over industrial societies so huge that the average owner”—stockholder—“seldom exercised more than theoretical control.” Typically, they ruled not individually and by fiat but by committee, pooling information and expertise in what were variously called executive groups, task forces, assault groups, or, in the modish egalitarian spirit, “working parties.” In The New Industrial State John Kenneth Galbraith called those who thus shared power the technostructure.

  Bright, well-educated, and highly motivated, the men of the technostructure suffered, ironically, in one area to which they gave great attention: public relations. The problem here was refractory and institutional. American industry had always deceived itself and others about its true nature. Professing faith in Herbert Spencer went with the job, like the key to the executive washroom and membership in the Republican party. Executives insisted upon the viability of the profit motive, even though their own careers frequently gave it the lie; they continued to drive themselves although taxes took huge bites of their salaries. The name of John Maynard Keynes was ritualistically hissed even as they defected from Barry Goldwater, who not only criticized Keynes but actually meant it. They encouraged stockholders to think possessively about their corporation, yet the influence of individual investors, always minimal, had declined even further by the 1960s, and anyone attending their annual meetings could quickly perceive that the decisions made by individuals there depended upon the information which the technostructure chose to provide them.

  This masquerade had been noted by economists. Usually the duplicity had been dismissed as harmless. After all, political ethics were honored more often in the breach. The technostructural deceit was graver than it seemed, however. As the Johnson administration grew older with no resolution of the Vietnam conflict, American businessmen were astonished to find that demonstrators were turning on them and accusing them of committing monstrous crimes with products like napalm. They couldn’t understand it; didn’t these angry people know that management and government were natural antagonists, not co-conspirators? They believed that and thought it should be obvious to everyone. But of course it was untrue. The truth was that by the late 1960s the military-industrial complex which had alarmed Eisenhower at the opening of the decade had continued to grow until the United States had—there is no other name for it—a planned economy.

  In 1967 Jean-Jacques Servan-Schreiber startled U.S. readers with such blunt assertions as “Federal agencies have been collaborating with American corporations in developing advanced technology ever since the end of the war,” and “In the United States nearly every major industry gets a substantial amount of federal assistance.” It scarcely seemed credible. Roosevelt’s heirs were still entrenched in Washington, scorning the economic royalists and the moneychangers; presidents of the National Association of Manufacturers condemned Washington paternalism; speakers at the U.S. Chamber of Commerce continued to explain that the government, since it never made anything, was essentially parasitic, and that the key to all economic progress was the businessman who was prepared to risk his capital in hope of gain.

  This was perhaps true of the child’s lemonade stand then being acclaimed in full-page ads extolling free enterprise—though where the child would be without paternal subsidies was unexplained—but it had lost all applicability for the five hundred giant corporations which, by the 1960s, accounted for two-thirds of the nation’s industrial production. Where was the risk for the Rand Corporation, whose total budget was underwritten by the U.S. Air Force? What gamble did IBM run when it invested five billion dollars in the perfection of integrated circuits for its third generation of computers, knowing that the Pentagon stood behind every dollar? How could ITT’s work on miniaturized electronic devices be called speculation when NASA knew that a manned flight to the moon would be impossible without them? As technology became more sophisticated and the lead time required for new developments lengthened, firms which were asked to break new ground demanded long-term contracts. Industrial executives and government bureaucrats, sharing the same goals, drew up budgets and reached decisions together. If the finished products were useful in marketable wares, there was nothing to stop the executives from cleaning up. Often they did. Integrated circuits—microcircuits which eliminate a chain of linked electronic parts: transistors, resistors, condensers, and tubes—are an example. Huge space rockets could not get off the pad without them. They made Polaris missiles and the swing-wing F-111 fighter possible. Boeing SSTs required them. So did the European Concorde prototype; governments in Europe had not been so cooperative, and when the manufacturers there needed the microcircuits, they had to deal with the only three firms making them, all American: Fairchild, Texas Instruments, and Motorola. The devices, they found, were expensive.

  It would be wrong to suggest that the American taxpayer had been swindled in this process. The government was committed to space travel; the electronic computer had become indispensable to the machinery of national strategy; improved methods of transport were in the public interest; national prestige benefited. Indeed, Servan-Schreiber was lost in admiration for the ingenious Yankees: “Behind most of their recent innovations is a huge reservoir of federal funds that have financed the most profitable investment any people ever made for itself.” The byproducts of space research alone included tremendous and invaluable gains in understanding refractory metals and equipment for working in vacuums. Through federal guarantees of large outlays of capital, the Pentagon, NASA, the AEC and the Federal Aviation Administration made possible the creation of marvels which would otherwise have waited a generation. Between the invention of photography and the manufacture of cameras, 112 years passed, from 1727 to 1839. The gap was 56 years for the telephone and 35 for radio. By paying for technical development and assuring a market for the end result, Washington had cut the lag to six years for the atomic bomb, five years for the transistor, and three years for the integrated circuit. There is a case to be made against the process, but it is a case against progress. That many Americans would find it persuasive is doubtful.

  What was not possible, however, was to argue that industry had maintained its sovereignty—that it remained free to oppose decisions made in Washington. With the administration spending 15 billion dollars a year on research and development, as against 6 billion from business and private agencies, the presumption of domination by the government was inescapable. In 1929 federal, state, and municipal governments accounted for about 8 percent of all economic activity in the United States. By the 1960s the figure was between 20 and 25 percent, far exceeding that in India, a socialist country. The National Science Foundation reckoned that federal funds were paying for 90 percent of research in aviation and space travel, 65 percent in electrical and electronic devices, 42 percent in scientific instruments, 31 percent in machinery, 28 percent in metal alloys, 24 percent in automobiles, and 20 percent in chemicals. Washington was in a position to hold the very survival of great corporations hostage. It never came to that; no one was that crude; the matter was never discussed. Nevertheless, big industry had surrendered a large measure of autonomy.

  In another time this circumstance would have concerned few Americans and aroused even fewer. Johnsonian prosperity was being enjoyed on all levels of society. Except in times of great distress the United States has rarely been troubled by protesters swarming in the streets and damning the government. Most people have a stake in the system; radical movements have been historically frustrated in their search for American recruits. But the Johnson years were witnessing another significant innovation. Since the war the nation had acquired an enormous student population. At the time of Pearl Harbor about 15 percent of college-age Americans were enrolled in college. By the fall of 1965, 40 percent were—over five million youths between eighteen and twenty-one. Within four years the figure would be 6.7 million. Nearly a half-million bachelor’s degrees were now being awarded each year. More than 30 billion dollars was being spent annually on formal education. Going to class had, in fact, become the largest industry in the United States, making students the country’s biggest single interest group.

  In the rest of the population this was a source of pride—education had become almost a secular religion, the proposed cure for all social ills—but undergraduates were becoming discontented and restless. Their futures were clouded by the Vietnam War, which grew more hideous and frustrating every day. Their dissatisfaction with the prosecution of the conflict was encouraged by thousands of the nation’s 150,000 tenured professors—men shielded from external discipline, who could be removed from their chairs only by death or personal scandal. Finally, many students were troubled by the realization that much of society’s enthusiasm for higher education stemmed from its market value. Just as other federal programs enhanced technology by creating microcircuits, so the huge grants to education served to train future technicians, executives, and customers. Undergraduates found that after acquiring a healthy skepticism, a university’s greatest gift to them, they were expected to stifle it and become cogs in industrial and governmental bureaucracies. Millions of parents saw nothing wrong with that. Many of the children were beginning to take another view, however. They said to one another: “They are snowing us. They are burying us. We cannot put up with it any more. We’re going to overthrow it.”

 
