The Big Sort

by Bill Bishop


  As with the decline in trust, however, the alignment of right-leaning political parties with churchgoing wasn't something made only in America. It happened everywhere. The most religious people in every industrialized country have come to support the political party on the right. It seemed to come as a surprise to Americans after the 2000 election that those who attended church once a week were overwhelmingly Republican. But there wasn't anything unusual about that relationship. A survey of thirty-two countries in the late 1990s found that seven out of ten of those who attended church once a week voted for the political party on the right. In fact, church attendance in all industrialized societies is the best predictor of right-leaning political ideology.†71

  The United States is peculiar among these nations only in that there are so many churchgoers. Pippa Norris and Ronald Inglehart combined different polls to find that the percentage of people who expressed a belief in God has declined in seventeen of nineteen countries over the past half century. Typically, Scandinavian nations have had the largest percentage of people drifting away from the pews. The two countries that haven't experienced a significant diminishment in belief are Brazil and the United States. When it comes to the number of people who believe in God and attend church regularly, Norris and Inglehart wrote, the United States is, statistically speaking, "a striking deviant case."72

  Over time, the ties between churchgoers and the party on the right have weakened in most industrialized countries, but not in the United States. The unusual thing about this country has been the stubborn and quite strong connection between religious belief and political party—a cultural peculiarity that, in the post-materialist politics of values, has allowed computer technicians in Orange County to find common cause with West Virginia coal miners and truck drivers.73

  6. THE ECONOMICS OF THE BIG SORT

  Culture and Growth in the 1990s

  Opportunity, not necessity, is the mother of invention.

  —JANE JACOBS

  "An Inexplicable Sort of Mass Migration"

  THE Baton Rouge Advocate ran a series of stories in 2002 titled "Leaving Louisiana"—and people were. They were hoofing it from Louisiana by the hundreds of thousands long before Hurricane Katrina washed, rinsed, and tumbled out those who remained. In the flow of people back and forth across the state line, Texas cities alone had a net gain of 121,000 Louisianans between 1992 and 2000. Most went to Houston or Dallas, but a good number migrated to Austin. There were enough Louisiana expatriates to turn the Shoal Creek Saloon into home away from bayou home—gumbo on the menu, a huge Saints football helmet on the roof, and the hated Cowboys banned from the television.

  The migration of people and money throughout the United States in the 1990s created a stark pattern. Some cities were sucking up people and income. Others were flinging them out with what appeared to be centrifugal force. Portland (I'm talking about Oregon throughout this chapter), Seattle, Dallas, and Austin gained at the same time the Cleveland Plain Dealer described the depopulation of its city as a "quiet crisis" and the Baton Rouge Advocate published its series. Dave Eggers, in his 2000 autobiographical book A Heartbreaking Work of Staggering Genius, called the movement of his educated and young midwestern friends to San Francisco "an inexplicable sort of mass migration."1 Actually, it was perfectly explicable. Eggers and his heartland buddies weren't the only ones switching addresses. As many as 100 million Americans resettled across a county border in the 1990s. People didn't scatter like ants from a kicked-over hill. There was an order and a flow to the movement—more like the migration of different species of birds. Eggers and his flock landed in San Francisco. A larger group of people—with a very different view of the promised land—migrated to Las Vegas. Economies, lifestyles, and politics merged in the Big Sort. Choices about lifestyle changed regional economies. And those differences in local development were reflected in a place's politics. The Big Sort wasn't happening at the state level. Rather, communities within the same state showed entirely different patterns of development and growth. The U.S. economy, its culture, and its politics were changing town to town, city to city.*2

  The picture the United States had of itself in the 1950s, 1960s, and 1970s was of a nation that was increasingly becoming one—religiously, racially, politically, and economically. From the 1950s through the mid-1970s, communities did grow more politically integrated. There was an economic convergence, too. The South—once "the nation's number one economic problem," according to Franklin D. Roosevelt—had become the beaming Sun Belt. Wages in different parts of the country began to converge. People with college degrees were "remarkably evenly distributed" among America's cities, according to Harvard University economist Edward Glaeser.3

  If such economic, partisan, and educational balance was the American way, by 1980 a decidedly un-American trend began. Places stopped becoming more alike and began to diverge. The economic landscape stopped growing flatter, and, in Richard Florida's description, it got spikier.4 The country got particularly spiky after 1980 as Americans segregated by education. In the last thirty years of the twentieth century, education levels surged nationally. In 1970, 11.2 percent of the population had at least a college degree. That figure increased to 16.4 percent in 1980, nearly 19 percent in 1990, and 27 percent in 2004. But as the national totals of college-educated people grew, education differentials between cities widened with each decade. The variance among cities was astounding. The percentage of adults with a college education increased in Austin from 17 percent in 1970 to 45 percent in 2004. In Cleveland, the change was only from 4 percent to 14 percent. Not only was Cleveland behind, but it was falling further behind. Schooling attracted schooling, as people with degrees moved to live among others with the same level of education. By 2000, according to Glaeser, there were sixty-two metropolitan areas where less than 17 percent of adults had college degrees and thirty-two cities where more than 34 percent had finished college.5 The differences were even more dramatic among the young. More than 45 percent of twenty-five- to thirty-four-year-olds in Raleigh-Durham, North Carolina, had a college degree in 2000; that figure was only 16 percent in Las Vegas.6

  Education had always predicted city growth, but beginning in the 1970s, that relationship strengthened.7 The cities that grew the fastest and the richest were the ones where people with college degrees congregated. (Fast-growing Las Vegas was obviously an exception.) As people with different levels of education sorted themselves into particular cities, the migration pattern set off segregation by income. Average city wages, which were converging in the 1970s, grew more unequal throughout the 1990s.*8 The per capita income of the ten metro areas with the best-educated residents rose 1.8 percent a year in the 1990s. The per capita income of the ten cities with the least educated population grew only 0.8 percent a year.9 Segregation by education was particularly apparent in rural areas. By 2000, the percentage of young adults with a college degree in rural areas was only half that of the average city.10

  This was the Big Sort of the 1990s. Every action produced a self-reinforcing reaction: educated people congregated, creating regional wage disparities, which attracted more educated people to the richer cities—which further increased the disparity in regional economies.* The Big Sort was just beginning with education. Or was it ending with education? It was hard to tell. By the turn of the twenty-first century, it seemed as though the country was separating in every way conceivable.

  Race

  An astounding 40 percent of the country's 320 metropolitan areas lost white population in the 1990s. The common notion of "white flight" is as a Caucasian escape from the central cities to the suburbs. In the Big Sort, however, there was a wholesale shift of white residents from one set of cities to another. Whites fled two kinds of cities. They abandoned older factory towns in the North and Midwest. Pittsburgh, Detroit, Buffalo, Hartford, Providence, Cleveland, Milwaukee, Jersey City, and Newark all lost tens of thousands of white residents. Whites also left the nation's largest cities, some of which were growing increasingly expensive: Los Angeles, New York, San Jose, Chicago, and Philadelphia (as well as Orange County, California).

  Whites went to high-tech cities: Atlanta, Phoenix, Denver, Portland, Austin, Dallas, Raleigh-Durham, Seattle, Minneapolis, and Boise. And they filled retirement or recreational cities: Las Vegas, West Palm Beach, Orlando, and Tampa. Blacks, meanwhile, moved to cities with strong black communities: Atlanta, Washington, New York, Chicago, Houston, Dallas, Fort Lauderdale, Baltimore, and Philadelphia. Only 9 out of 320 cities lost black residents.

  Age

  In 1990, young people were evenly distributed among the nation's 320 cities. By 2000, twenty- to thirty-four-year-olds were concentrated in just a score of cities. Some 124 American cities had a net loss of Generation Xers in the 1990s, and more than 700,000 young adults left rural America.* If Hispanics are excluded, 170 U.S. cities—more than half—lost young people in the 1990s. (Young Hispanics increased by 8 percent nationally, decreasing in only seven metro areas.) Eighty percent of the non-Hispanic whites ages twenty to thirty-four who moved during the 1990s relocated to the twenty-one cities highest in technology and patent production.†

  Young people are more likely than old people to move. And young, educated people are more likely to move farther and more often than are young, less educated people. According to one longitudinal study covering most of the 1980s and 1990s, only 19 percent of young people with no more than a high school degree moved between states, but 45 percent of those with more than a college education migrated to a new state.11 Young people moved disproportionately to central cities. The likelihood that a twenty-five- to thirty-four-year-old would live within three miles of a city center increased significantly in each of the fifty largest metro areas during the 1990s.12 Older people, meanwhile, clustered in the country's least dynamic (economically and technologically, at least) cities. The 119 cities producing the fewest patents had the highest proportion of people age sixty-five or older.

  Ideas

  It's not easy to trace ideas as they arise and then become economically useful. (Education is one surrogate measure.) In the United States, however, you can track patents. From the 1970s to the 1990s, the average number of patents granted each year rose nearly 50 percent. But the distribution of those patents was vastly unequal, and that lopsidedness contributed to the growing regional wage inequality. In the 1990s, people who lived in the places that produced the most patents earned higher wages than those who lived in the places that produced the fewest patents—not so surprising. Harvard University's Michael Porter calculated that 30 percent of the variation in wages across regions could be statistically related to differences in patent production.13 Between 1975 and 2001, San Francisco's yearly patent production increased nearly 170 percent. Patents in Minneapolis rose 116 percent. Atlanta was up well over 200 percent, Portland 175 percent, and Seattle 169 percent. Little Boise, Idaho, increased its patent production by a factor of ten. In the 1970s, Austin produced fewer patents each year than Greenville, South Carolina; Davenport, Iowa; Elmira, New York; and Lancaster, Pennsylvania. The yearly production of patents in Austin jumped from 75 a year in 1975 to more than 2,000 by 2001—a twenty-seven-fold increase in twenty-seven years. In 1999, Austin produced more patents per capita than all but Rochester, New York, and Minneapolis. Meanwhile, other cities lagged: patents in Cleveland were down 13 percent and in Pittsburgh 27 percent.

  Wages

  This sorting and clustering of highly educated people had an inevitable effect on wages. By 2000, Michael Porter found "striking variation in average wages" across economic regions, with average pay ranging from just over $19,000 a year in western Nebraska to over $52,000 in San Francisco.14 Wages during the 1990s increased 7.1 percent a year in Austin, but only 1.8 percent a year in Wheeling, West Virginia.15 Growing wage inequality tracked increasing political polarization, according to political scientists Nolan McCarty, Keith Poole, and Howard Rosenthal. The nation's income distribution grew more unequal in parallel with the rising partisanship in Congress.16

  Occupation

  Richard Florida was a professor of regional development at Carnegie Mellon University when he noticed a switch in the way businesses went about hiring new workers. Instead of people moving to corporations, corporations had begun moving to where pools of talent were deepening. Florida, Kevin Stolarick, and a group of researchers at Carnegie Mellon identified a new class of workers. They called them "creatives." These were top managers, artists, writers, engineers, and teachers. They thought for a living. Florida tracked the increasing numbers of these workers over time, and then he tracked them across space.*

  Creative-class workers were sorting themselves into the same cities that, according to our analysis, were also producing most of the country's patents. The top five cities in percentage of creative-class workers were Washington, Raleigh-Durham, Seattle, San Francisco, and Austin. Las Vegas had the lowest percentage of creatives, followed by Miami, Memphis, and Louisville. Florida found that the cities teeming with creative-class workers had fewer working-class jobs—the assembly-line, mechanical, construction, and production jobs traditionally called blue-collar work. The cities with the most service jobs (Las Vegas being at the top) also lacked creative-class workers. There simply wasn't much overlap. Few cities had both a sizable working class and a large creative class. The same kind of segregation was happening at work. Older firms—General Motors and U.S. Steel—hired people with all manner of skills. In the economy being created, there were high-skilled firms and low-skilled firms—Google and McDonald's. People with skills and education were less likely to work for a business that hired employees without a college degree.17 Florida concluded that the American "working population is re-sorting itself geographically along class lines." The sorting had little to do with region. There were creative cities in the Sun Belt (Atlanta and Raleigh-Durham), the Frost Belt (Minneapolis and Rochester, New York), and the heartland (Chicago, Dallas, and Denver). But cities increasingly differed in the types of workers they attracted. And this differentiation, Florida warned, would lead to a "new form of segregation."18

  That is exactly what happened.

  The FedEx Truck Doesn't Stop Here Anymore: The Other Side of the Big Sort

  The cast of Harlan County, Kentucky's community play Higher Ground consisted of retired coal miners, teachers, bluegrass musicians, and members of church choirs. They had assembled in the late summer of 2005 in a darkened auditorium at the community college to work through some scenes. This night they were to begin work on what would come to be known as the "drug zombie dance." In the scene, a doctor sat in a chair as the chorus stumbled and staggered onstage. They were the drug zombies who had come to the doctor for pain pill prescriptions. As the doctor wrote on little slips of paper, the zombies sang—chanted, really—"I've got a pain; I've got a pain; I've got a pain in my back. And I'm searching for a cure to take my pain away." The zombies passed money to the doctor, who tossed dollar bills in the air as police sirens began to whine.

  Most small towns put on community plays to celebrate their founding by brave pioneers or a battle won by stalwart local soldiers. They commemorate "little engine that could" determination that leads to inevitable civic success. In 2005, however, Harlan County was producing a community play about civic failure—about the county's battle with drug addiction, primarily the painkiller OxyContin. It was a struggle the county had so far lost.

  Harlan County sits in the extreme southeast corner of Kentucky and is perhaps the most infamous coal community in the country. For most of the last century, Harlan County was an outpost of industrial America. Ford, U.S. Steel, and International Harvester all had mines there. In the 1930s, "Bloody Harlan" became the center of union organizing efforts by both the Communist Party and the United Mine Workers of America. And in the 1970s, it was the scene of another mine strike chronicled in the Academy Award-winning film Harlan County U.S.A. Fortune, however, has not accompanied fame. Close to 80,000 people lived there in the 1940s. Now the population is roughly one-third that size and dropping with each census.

  Traveling the eastern coalfields is a reminder that the most abundant product of the Big Sort has been inequality. Sixty years ago, the proud city of Welch, in southern West Virginia's McDowell County, was a "little San Francisco," local historian Jean Battlo told me. Guy Lombardo and Glenn Miller played Welch, and on a Saturday afternoon, the city was crowded with pedestrians and Packards. McDowell County was at the core of industrial America's economy. Then it wasn't. Three-quarters of McDowell's people left between 1950 and 2000. It lost nearly 10 percent of its population in the first four years of the twenty-first century. "Rational people leave, if they can," Jerry Beasley, president of nearby Concord College, told me. There's always been an economic and cultural distance between the small towns in the coalfields and urban America. But the gap has been growing, and it's now almost unimaginable that Welch and Austin are part of the same country.

  It's not just Harlan County and Welch that have lost ground. The Big Sort has left much of rural America behind. The realities of rural life come to light in the list of Americans killed in Iraq and Afghanistan. Bob Cushing and I began tracking the hometowns of those killed in the war when the conflict began, and our very first tallies revealed that the U.S. military was disproportionately filled with young men and women from rural counties. The bigger the city, the smaller the percentage of its young people who were likely to die in the war. By late 2006, rural counties had casualty rates 60 percent higher than cities and suburbs.19 By early 2007, Bismarck, North Dakota, had a casualty rate among its military-age citizens that was almost ten times that of San Francisco. The Pentagon is straightforward in its recruiting strategy. The military finds the highest proportion of its recruits among good kids who have few prospects for decent jobs or further education. Or, as a Department of Defense study put it in reverse, "propensity to enlist is lower for high-quality youth, youth with better-educated parents, and youth planning to attend college."20 There has been a form of economic conscription at work in the wars in Iraq and Afghanistan. The death rate among military-age residents living in the nation's high-tech cities has been half that of military-age people from rural America.

 
