American Empire

by Joshua Freeman


  At the end of the twentieth century, Americans lived on a scale of consumption and resource usage unprecedented in human history and unmatched elsewhere on the planet. No radical departures brought this about. With the resumption of economic expansion in the 1980s and sustained growth during the 1990s, development along lines already laid out remade the built environment and social landscape. Through incremental change, the United States became a very different society than it had been at the end of World War II, or even at the end of the Vietnam War. The cumulative effect of decisions made in the private sector and the public realm led it to a down-home imperial grandiosity, with implications not just for Americans but for the ecological status of the earth itself.

  Immigration

  The United States puffed up in part because of population growth. Between the end of the Vietnam War and the end of the century, the country’s population increased by over a quarter. The rate of population growth, which slowed after 1950, picked up between 1990 and 2000, when the number of residents swelled from 249 million to 281 million, making the United States the third most populous country after China and India. It lagged behind the world as a whole in its rate of population growth but exceeded by several-fold the rate for most industrialized countries, which typically had very low or even negative growth.

  During these years, the birth rate in the United States hovered near historic lows, rising modestly after hitting bottom in the mid-1970s. (It began to fall again during the 1990s.) Among major population groups, only Hispanic women had a fertility rate significantly above the replacement level. Non-Hispanic white women averaged only 1.8 children apiece. Rather than decreased sexual activity, the decline in fertility reflected increased use of contraception and the availability of legal abortion. In the late 1990s, the number of abortions equaled a third of the number of live births. By gaining more control over their lives and bodies, women transformed the demography of the country. They married later, had children later, spaced births more widely, and stopped childbearing earlier.

  Immigration more than compensated for the low birth rate. From World War II to the end of the century, the number of immigrants coming to the United States through legal channels rose steadily: 1.0 million in the 1940s, 2.5 million in the 1950s, 3.3 million in the 1960s, 4.5 million in the 1970s, 7.3 million in the 1980s, and 9.0 million in the 1990s. In addition, millions more came illegally, with at least seven million undocumented foreign-born residents in the country in 2000. Immigration peaked in 1991, when the 1.8 million arrivals equaled 0.72 percent of the total population. In absolute terms, never before had so many immigrants entered the country (though the rate of immigration had been higher early in the century). The exceptionally large flow of immigrants accounted, to a large measure, for the exceptionally high rate of population growth in the United States compared to other industrialized nations.

  Increased immigration stemmed from broad global changes. Improved health in less developed countries contributed to rapid population growth, exceeding the capacity of local agriculture or urban job markets to absorb the ever larger generations of people seeking work. In many places the spread of market relations and the arrival of international capital disrupted traditional economic arrangements. So did civil wars and civil strife. The growing ubiquity of television, tape recorders, CD players, and other mass media brought knowledge of the wealth, economic opportunities, and culture of the United States to people around the world. With most rich countries imposing far more severe restrictions on immigration, it became the favored destination for emigrants, particularly from countries in its economic, military, and political orbits. According to the United Nations, between 1995 and 2000, more than half the people (net) moving from less developed regions of the world to more developed regions went to the United States.

  The 1965 immigration act made possible the growth of immigration and profoundly changed its nature. The elimination of national quotas opened the door for emigrants leaving poor countries around the world. Very quickly, arrivals from Asia and Latin America eclipsed arrivals from Europe, as the immigration stream became much more heterogeneous in national origin, occupation, and class status than ever before. In 1990, immigrants from Europe made up only 3 percent of the total inflow, down from 90 percent in 1900. Mexicans constituted nearly a quarter of immigrants during the 1980s and 1990s, while substantial numbers of Central Americans, Caribbeans, and South Americans arrived too. By the end of the 1990s, more Hispanics lived in the United States than African Americans, a break from the long historical pattern in which white descendants of European immigrants and black descendants of slaves constituted the largest population groups and to a great extent defined, separately and in their interaction, the dynamic of the country. Immigration from Asia trailed that from the Western Hemisphere, but it far exceeded the pre-1965 levels and included national groups that previously had been all but unrepresented: Indians, Pakistanis, Bangladeshis, Iranians, Vietnamese, Laotians, Cambodians, and Thais.

  Post-1965 immigrants clustered in a handful of gateway cities, including New York, Los Angeles, San Francisco, Miami, Chicago, Houston, and Washington, D.C. But over time, they also filtered into areas that had not seen substantial numbers of immigrants for generations, if ever. In 1980, Green Bay, Wisconsin, a tidy midwestern town, best known for its professional football team, the Green Bay Packers, had an almost entirely white, native-born population. But then Hispanic immigrants moved to the area to work in the meatpacking plants that gave the football team its name, while Hmong refugees, allies of the United States during the Vietnam War, were resettled in the city. In 2000, one out of ten Green Bay residents was Hispanic, Black, or Asian, a dramatic change from the extreme homogeneity of the past. In many towns and cities in the South—especially in Virginia, North Carolina, and Georgia—a region that historically had very low levels of immigration, substantial communities of Hispanics and other immigrants could be found by the late 1990s, drawn to jobs in meatpacking and food processing plants, textile mills, furniture factories, and other industrial enterprises. In 2000, in twenty-seven states 5 percent or more of the population was foreign born. (Nationally, foreign-born residents and children of foreign-born parents together constituted a fifth of the population.) Though immigrants continued to be heavily concentrated in urban, coastal areas, they had become a presence everywhere except in parts of the Deep South and the Great Plains states.

  During the nineteenth and early twentieth centuries, the overwhelming majority of newcomers arrived without skills relevant to an industrial economy. By contrast, the late-twentieth-century immigration surge included a substantial number of well-educated professionals, white-collar workers, and businesspeople, who came with skills and in some cases capital that allowed them to move directly into well-paid jobs and middle-class lives. Still, most immigrants, like their predecessors, did not have the skills, connections, or language facility to escape low-wage work in service, agricultural, or blue-collar jobs. In many cases they filled occupational niches abandoned by native-born workers as pay and conditions deteriorated with deunionization and business pressure to lower costs. In Massachusetts, after a spurt of immigration during the 1990s, foreign-born workers held 45 percent of the semiskilled blue-collar jobs and 27 percent of the service industry jobs. In much of the country, the basic labor of social reproduction, including childcare, eldercare, and cleaning, cooking, and maintenance in homes and hospitals, was largely performed by immigrant workers. In California, immigrants and sojourners, mostly from Latin America, made up more than 90 percent of the agricultural workforce. As business succeeded in downgrading pay and conditions for workers in the lower occupational strata, immigrant labor became utterly crucial to the economy and to sustaining the American way of life.

  Big Cities, Empty Plains

  As the population of the country increased, so did its density. In 1940, the country had thirty-seven people per square mile; in 2000, nearly eighty. That was well below the global density of 120 people per square mile and far below the density of the major European powers (not counting Russia, whose vast land area brought its density down to just twenty-two people per square mile). Nevertheless, the unequal distribution of population made parts of the country feel downright crowded. In 1990, for the first time, more than half the population resided in metropolitan areas with more than one million people. And in 2000, over half of all Americans lived in just ten states. New Jersey, the most densely populated state, had 1,134 people per square mile.

  Regions once thought of as rural became crowded in parts. Between 1980 and 2000, the share of the population living in the West rose from 19 percent to 22 percent and in the South from 33 percent to 36 percent. Most western and southern growth took place in urban and suburban areas, not the thinly populated countryside, leading to extensive sprawl.

  Even as traffic jams and crowded schools testified to increased population density, some sections of the interior of the country depopulated, particularly a band running from the Mexican border just east of Big Bend, Texas, north through the Great Plains to the Canadian border. In 2000, over a century after the superintendent of the census declared that the country no longer had a continuous frontier, an area in its center as large as the Louisiana Purchase, nearly 900,000 square miles, met the nineteenth-century federal definition of “frontier,” two to six people per square mile. People had been moving out of the dry lands of the Great Plains for seventy years, as cattle raising and irrigated agriculture proved difficult or impossible to sustain. In some areas, large companies bought up and consolidated family operations, so that crop raising continued—aided by federal agricultural subsidies—but population dropped and the small towns that dotted the region became ghosts of what they once were, with stores, banks, churches, restaurants, and schools closing. In other areas, especially in the northern Great Plains, land reverted to prairie.

  As whites moved out of the Plains, Indians moved back in. The Native American populations of North and South Dakota, Montana, Nebraska, and Kansas rose between 1990 and 2000 by 12 to 23 percent. Indians returned to reservations from elsewhere in the country, attracted by life in communities of other Indians and jobs at the casinos that provided the main source of economic growth for many of the Great Plains tribes (though their reservations remained among the poorest places in the country). The spread of Indian-owned casinos was an offshoot of the increased militancy and tribal resurgence of the 1960s and 1970s, facilitated by a 1987 Supreme Court ruling that severely limited the power of states to regulate tribal gambling operations.

  Indian tribes played an important role in the return of the buffalo, which had been on the verge of extinction at the start of the twentieth century. Tribes built herds and systematically managed them. When the century ended, 300,000 bison roamed the Great Plains, as the movie of history seemed to be running backward. Euro-American dry lands agriculture had not panned out as homesteaders once assumed. Instead, in much of the region it proved to be a brief, unsuccessful interlude.

  Crime and punishment were among the few new sources of income in economically hard-pressed rural areas. Over the course of the 1990s, small-scale laboratories producing methamphetamine—a drug of choice for rural whites—popped up in small towns and rural backwaters across the country. With them came more drug use and a wave of rural crime. In some rural counties, the crime rate far exceeded urban norms. While crime brought in some money, so did incarceration. Many small communities found a source of jobs and money in attracting new prisons. Often they were built and operated by private companies and housed convicts from other states that, in an era of soaring incarceration rates, had run out of room in their own institutions.

  Suburbia

  Even as some parts of the countryside depopulated, others disappeared beneath suburban development. The combination of rising population, decreasing household size, and easy credit stimulated a gigantic residential building boom. Between 1990 and 2000 alone, fourteen million housing units were constructed. Home construction and sales became a major economic driving force, with real estate industry employment jumping from less than a million in 1980 to over a million and a half in 2000. (Women made up more than half the sales force.)

  The United States encompassed a wide variety of patterns of life, but by 2000 the suburb had become clearly dominant. That year, exactly one-half of all Americans lived in a suburb. (Thirty percent lived in a central city and the remainder in rural regions.) Jobs as well as people kept migrating to the outer parts of metropolitan regions, often quite far out along radiating highways. Ninety percent of new office space built during the 1990s was suburban, leaving Chicago and New York as the only major metropolitan regions with more office space in their central cities than their surrounding suburbs. Ever more people grew up taking suburban life for granted, as the only way of life they knew. Suburban sensibilities and physical forms became templates even for social institutions located in other settings, from enclosed central-city malls to sprawling, car-oriented universities.

  Culturally, from John Cheever to Father Knows Best to American Beauty (which won the 1999 Academy Award for Best Picture), suburbia was treated as the land of the white, middle-class nuclear family—paradisaical, dysfunctional, or somewhere in between. But by the end of the twentieth century, the reality was far more complex. In 2000, married couples with children made up only slightly more than a quarter of suburban households. When Levittown, New York, celebrated its fiftieth anniversary in 1997, a quarter of the homes were occupied by either single-parent families or mothers and grown daughters living together.

  In 2000, barely half of all households—suburban or otherwise—included a married couple, down from nearly three-quarters in 1960. A third of men and a quarter of women never married (including 42 percent of black women). As in most of Europe, more and more children were born to unmarried mothers, with the proportion of births outside of marriage reaching one-third at the century’s end. (African Americans had a much higher percentage of children out of wedlock than other groups, but starting in the early 1990s, unlike whites, they experienced a modest increase in the proportion of children born to married couples.) Even among married families, the Father Knows Best household of a male breadwinner, stay-at-home wife, and live-at-home children became the exception.

  Most families with children had no stay-at-home parent. Stagnating wages made it difficult for even two-parent families to maintain what was seen as a comfortable way of life without maximizing their time at work, one of the factors that led to an increase in the average number of hours Americans worked each year. From 1,905 hours in 1979, the average work year rose to 1,966 hours in 1998, a boost equivalent to an extra week and a half of work a year, which pushed the country past Japan to have the longest work year among the major industrial powers. Along with a higher percentage of adults in the workforce than in any other advanced industrial country, this kept U.S. per capita income the highest among industrial nations. Americans could afford what by world standards remained on average an exceptionally bountiful way of life by spending less and less time at home and more and more time at work. And getting to work. As metropolitan regions sprawled over ever larger areas, workers faced longer and longer commutes.

  Levittown, as it approached its half-century celebration, remained remarkably racially homogeneous; whites made up 97 percent of its residents in 1990. But some nearby Long Island suburbs, like Freeport, had integrated in the wake of the civil rights movement. Nationally, suburbs slowly became somewhat more racially mixed. They also housed a growing number of immigrants who moved directly to suburban areas, bypassing the central cities that in the past had been the main entry points to the country. During the 1990s, more Central Americans lived on Long Island than in New York City.

  The vast Inland Empire east of Los Angeles, with well over three million people in 2000, epitomized the diversity that had come to characterize many suburban regions. A center for warehousing and transshipping goods entering the country through the ports of Los Angeles and Long Beach (including the flood of goods from China on their way to Wal-Mart stores across the country), the Inland Empire attracted a racially and ethnically mixed working class looking for local jobs and affordable housing as well as more upscale professionals and white-collar workers, many of whom undertook long commutes to Los Angeles. At least for a while, cheap homes and cheap mortgages made a suburban way of life possible for a broad cross section of the population. (When the subprime mortgage crisis hit in 2007, the Inland Empire was one of its epicenters.)

  Still, there remained in the DNA of suburbia an impulse for exclusivity and escape from racial, ethnic, and economic diversity. The most extreme manifestation came in the spread of gated communities, developments physically enclosed by walls or fences with access restricted to residents and their guests. Such developments could be found as far back as the nineteenth century, exclusive communities for wealthy families. In the 1960s, gated communities for the upper middle class began to be built, initially retirement or second-home resort communities, generally in parts of the country with year-round warm weather. Twenty years later, they began to be marketed as primary residences for families with working adults. Easy credit from deregulated savings and loans financed many of the gated developments, which often offered the good life in the form of swimming pools, tennis courts, fitness centers, landscaped or wooded grounds, and golf courses. By 2000, just over four million households (3.4 percent of the total) lived in communities with controlled access, while another three million lived in developments surrounded by walls or fences but without gatehouses or electronic gates.
