Fault Lines


by Kevin M. Kruse


  The Real World

  Despite the Republican zeal for impeachment, the party proved to be badly out of step with the public. Cable networks enjoyed higher ratings, but in the grand scheme of things, a scandal that revolved around the private sexual relationships of a president didn’t resonate with many Americans. If the country had shifted to the right on many domestic political issues, the liberalized cultural trends of the 1960s seemed to have won out by the 1990s.

  One of the most popular radio talk shows of the decade was the syndicated program of Howard Stern. Originally the host of a late-night program on WNBC in New York, Stern featured frank talk about sex, drugs, and everything else that the Moral Majority railed against. “The idea of the show is to convey real honesty on the air,” Stern explained in a 1991 interview, “to get away from the phony type of broadcasting where they bite their tongue and are afraid to say anything.” 11 By 1992, Stern’s program had spread across the country, becoming the number-one morning radio show in three major cities—New York, Los Angeles, and Philadelphia.12 As Stern’s empire grew, the Federal Communications Commission tried to rein him in, levying fines of over $100,000 in some instances.13 But the owner of the company that syndicated Stern’s show, Infinity’s Mel Karmazin, was happy to pay the fines since Stern generated so much in profits. With 15 million daily listeners in sixteen major cities, the shock jock brought in an estimated $20 million per year to the company. By 1995, his program was the number-one morning show in most top markets and a hit with men eighteen to thirty-four.14

  The success of the outrageous radio host reflected broader trends sweeping across American culture, as sex and violence became ever more prominent in video games, movies, and television. A study from the University of California at Los Angeles reported that “sinister combat violence” could be found in most Saturday morning cartoons, while 42 percent of the movies broadcast on network television were inappropriately violent.15 According to another survey, two-thirds of prime-time programs featured sexual material, while daytime shows like The Jerry Springer Show, launched in 1991, offered discussions of once-taboo topics like bestiality and incest.16 The erotic thriller Basic Instinct became a blockbuster success at the box office in 1992, ultimately earning $350 million and inspiring a stream of other sexually explicit films like Consenting Adults, Disclosure, Showgirls, and Striptease.17 Meanwhile, MTV quickly found that the most lurid music videos in its rotation were in the highest demand.

  The network also started to introduce original programming, starting with The Real World, a new style of “reality programming” that debuted in 1992. Copying the 1970s PBS documentary series An American Family, which had chronicled the dysfunction and eventual divorce of a seemingly wholesome white suburban family, MTV’s show placed a diverse set of young Americans in an apartment for a few months, all wired for video and sound. Despite its generally lighthearted style, the show occasionally tackled serious subjects. The third season, for instance, featured Pedro Zamora, a 22-year-old Cuban immigrant who had come to the country as part of the 1980 Mariel boatlift, and who was now suffering from AIDS. Zamora had long been an activist and hoped his role on the show might publicize the AIDS crisis and humanize its often-ostracized victims. As he had hoped, the show raised awareness for many younger Americans, who took lessons about safe sex into their own lives. “It’s his story and everybody else’s that make you think. I know I wouldn’t do it without a condom,” said one 15-year-old.18 When Zamora died in November 1994, President Clinton noted that the show had “changed the face of HIV and AIDS in America forever.” 19

  While the real-life struggles of Pedro Zamora did much to dramatize and humanize the plight of HIV/AIDS victims, fictional television programs increasingly showcased gay and lesbian characters in a more sympathetic light and in more central roles as well. In 1997, comedian Ellen DeGeneres, star of the popular ABC sitcom Ellen (1994–1998), came out publicly as a lesbian—first in real life, and then again as the character on her show. The decision generated a great deal of publicity and positive press for the show, with a cover story in Time magazine titled “Yep, I’m Gay.” But it also sparked a significant backlash from the Religious Right. Jerry Falwell mocked the star as “Ellen Degenerate,” while the Family Research Council urged conservative Christians to boycott the show’s advertisers. ABC began running “adult content” warnings before the program and then canceled the show altogether at the end of the season. The following year, NBC launched Will & Grace (1998–2006), another show with a gay lead character—though, in this case, one played by a straight actor, and with his sexuality downplayed considerably. “Network executives reportedly got queasy and now Will seems not so much gay as neutered,” one critic noted of its debut. “He’s not only a eunuch, he’s a lawyer, and a corporate lawyer at that; how much more sexless can anyone get?” Though cautious at first, the show demonstrated that gay and lesbian themes could find a footing even on network television, in the midst of a concerted conservative backlash.20

  Meanwhile, on cable channels, even franker discussions of sex and sexuality were becoming commonplace. The HBO show Sex and the City (1998–2004), for instance, centered on the relationship struggles of four single women in their mid-30s and 40s. Framed through the perspective of its lead character Carrie Bradshaw, a New York City newspaper sex columnist, the program advanced a cynical view of modern romance. “Welcome to the age of uninnocence,” she noted in one voiceover. “No one has breakfast at Tiffany’s and no one has affairs to remember. Instead we have breakfast at 7 a.m. and affairs we try to forget as soon as possible.” The program’s perspective quickly made it the top-rated comedy on cable television and a national phenomenon. “Sex and the City” tours started up in New York. Across the country, groups of women gathered for viewing parties, taken in by the show’s frank discussions of sex and its focus on the woman’s perspective. “I love that on the show they talk about men the way that men talk about women,” noted a fan in Atlanta. “It turns the tables, and I try to think of how I can do that in my own life.” 21

  Frank discussions of sex and sexuality shaped even apparently staid subjects like the computer boom. As had happened with the advent of videocassette recorders, new advances in computing were regularly wedded to the porn industry. One of the best-selling computer discs in 1986, for instance, was Virtual Valerie, a game that revolved around simulated sex. Even early advances on the internet were fueled by pornography, which introduced cutting-edge mechanisms like streaming video and popularized new browsers for the World Wide Web such as Mosaic and Netscape Navigator. The internet itself grew explosively, from about three hundred connected computers in 1981 to a million by 1993, then roughly 37 million by 1996 and 83 million by 1998.22 Sex, once again, was often an important part of this process. Internet porn sites were receiving about 50 million hits per month by 1999, generating over $1 billion in revenue per year.23 There were efforts to regulate the industry, particularly in response to news of child pornography rings. But regulating this new technology became extraordinarily difficult because in the internet age, access came from anywhere in the world.

  Set against this increasingly permissive climate, Republican attacks on Clinton for his sexual infidelity simply failed to resonate. Polls showed that 63 percent of Americans disapproved of the campaign to impeach the president, a dissatisfaction that made itself felt in the 1998 midterm elections. Defying the norm in which the opposition party would typically gain seats in the midterms, the GOP actually lost ground in the House. The results were so shocking that they sparked an internal rebellion resulting in Newt Gingrich’s resignation as Speaker. (Gingrich’s standing had already been weakened by the revelation that he too was having an extramarital affair.) Despite the warning sign from voters, House Republicans remained committed to the impeachment cause. But voters recoiled again. In the week after the full House voted to impeach the president on December 19, 1998, Clinton’s favorability leapt ten points in the Gallup Poll, to his all-time high of 73 percent approval. Meanwhile, the Republican Party’s approval rating sank to 31 percent. Reading the public’s mood better than their colleagues in the House, Republicans in the Senate soft-pedaled the impeachment trial, leading to Clinton’s acquittal in February 1999.24

  Boom and Bust

  In many ways, the impeachment campaign failed because it simply couldn’t compete with a bigger story: the booming economy of the late 1990s. Even though Americans had been titillated by the sordid details of the Lewinsky affair and transfixed by the political fallout, their general happiness with the status quo led them to resist Republican calls for the president’s removal. In January 1999, when the Senate deliberated the matter, Gallup found that 70 percent of all Americans reported being satisfied “with the way things are going in the United States”—a dramatic improvement on the 14 percent that had said so in June 1992. That satisfaction with the status quo largely reflected their satisfaction with the economy. In January 1999, 69 percent of Americans rated the economy as either “excellent” or “good”; in June 1992, only 12 percent had.25

  The unbridled optimism of the moment was captured in a game show that became a phenomenon: Who Wants to Be a Millionaire? An import from England, the show essentially replicated the format of a 1950s hit, The $64,000 Question, though this time the expectations were much higher and the potential payoff, of course, much grander. A top ratings draw in 1999 and 2000, the show soon became the cornerstone of ABC’s programming. Indeed, the network was so excited about the ratings bonanza that it put the show on several nights a week, breaking the tradition of reserving one slot for hit shows such as this. At its peak, up to 30 million people tuned in each night.

  The show’s popularity spoke to a nation reveling in an economic recovery. The Dow Jones Industrial Average more than tripled between 1993 and 2000, booming from roughly 3,600 to 11,000. As impressive as those gains were, they were nothing compared to what was taking place on the NASDAQ, an exchange that focused more directly on the tech companies that drove the decade’s growth. In January 1993, the NASDAQ stood at about 670; in January 2000, it stood at about 4,100—an amazing sixfold increase. Everyone seemed, on the surface at least, to be doing well. The percentages of Americans who were poor or unemployed had reached new lows by the end of the 1990s. By April 2000, the unemployment rate in America had been reduced to just 3.8 percent, the lowest rate in more than thirty years. In his State of the Union address that year, Clinton reflected the euphoria that most Americans had about the economy. “Never before,” the president proclaimed, “has our nation enjoyed, at once, so much prosperity and social progress with so little internal crisis and so few external threats.” 26

  Silicon Valley, the San Francisco Bay area home to the high-tech boom, quickly became the symbol of the late-1990s economy. As private capital poured into the surging computer industry, young professionals made huge amounts of money at rapid speed, becoming multimillionaires and, moreover, symbols of a resurgent America. Steve Kirsch exemplified the trend. As the CEO of Frame Technology, he made $30 million and then, at just 38 years of age, sold his firm for $500 million more. He soon started another company called Infoseek, one of seven businesses he would found in rapid succession. Another young “serial entrepreneur,” Kurt Brown, described the process as “like a roller coaster. The first time through you get sick and say, ‘What am I doing?’ Then you get off, a little woozy, and you need some time to let your lunch settle. But then you just want to get back on the roller coaster again—a bigger one.” 27 Similar stories came out of Silicon Valley. In 1994, a 22-year-old named Marc Andreessen joined forces with Jim Clark, who had more experience in the tech sector, to found a company called Netscape, whose browser allowed computer users to navigate the World Wide Web. In 1998, after having appeared on the cover of Time magazine, Andreessen and Clark sold their company to America Online for $4 billion in stock.

  They and others constituted, according to one journalist in the New York Times, a “loose tribe of young moguls, minimoguls and moguls in waiting, a brat pack whose members play together and endlessly debate business strategies among themselves.” 28 Though women had constituted a significant part of the computer programming industry, Silicon Valley felt very much like a boys’ club in the 1990s, in every sense of the term. Many drove extraordinarily expensive cars, flew private jets, and owned huge homes with every luxury imaginable. The desire for more kept growing as the money kept flowing into their coffers. “To feel truly rich in Silicon Valley,” one of them said, “you have to be worth in the three-digit millions.” 29 Not to be outdone, Massachusetts launched its own high-tech corridor along Route 128, while New York mimicked the California boomtown with its own version, “Silicon Alley.” 30

  Across the country, between Silicon Valley and Silicon Alley, American cities were being remade. The most famous renovation of all was Times Square in New York. A stark symbol of decay in the 1970s, it underwent dramatic transformations in the 1990s. Under Mayor Rudy Giuliani, elected in 1993, the city completed the long process of remaking the area that had once been a center for dingy porno theaters and “peep shows” into a new site for family-friendly entertainment and upscale commerce. At one level, the NYPD implemented a harsh crackdown on the vast criminal world that had thrived on these streets for decades. Sex shops were closed; drug dealers were arrested. Meanwhile, the city government used zoning laws and tax incentives, as well as the sale of property, to bring in new businesses. The development of the area accelerated as a number of major companies, such as Disney and MTV, purchased huge swaths of real estate, and the police ensured that the millions of tourists coming there would feel as safe as they were in Disney World. The neighboring Bryant Park on 42nd Street emerged as a venue for wealthier city residents whose numbers were growing as more neighborhoods were converted to high-priced condos and co-ops. Indeed, half of the increase in income between 1992 and 1997 in New York came from people working in the financial service sector, even though they represented less than 5 percent of the workforce. The number of city residents who were categorized as being in the middle class, meanwhile, steadily declined from 35 percent in 1989 to 29 percent ten years later.31

  As New York made clear, the economic recovery of the 1990s was anything but evenly distributed. Upper income brackets boomed as those on the lower end of the income ladder struggled to stay in place. In the offices that housed many of the thriving companies, the borderlands of the new economy were clear, especially in Silicon Valley. Rosalba Ceballos went to work each day at the manufacturing giant KLA-Tencor to clean offices and sweep floors, only to return home to a tiny, cramped garage outside Palo Alto where she lived with her three kids. Maria Godinez, who worked as a janitor at Sun Microsystems, lived in a house packed with four different families, totaling twenty-two adults and kids. “Unfortunately,” said the head of the AFL-CIO division in Silicon Valley, “the New Economy is looking a lot like an hourglass with a lot of high-paid, high-tech jobs at the high end and an enormous proliferation of low-wage service jobs at the bottom.” 32

  Indeed, for all the attention given to the wars of Washington in the 1990s, the most pronounced division in the nation was neither cultural nor political, but economic. The pattern established in the 1980s—in which the rich became richer, the middle class shrank, and the poor sank further behind—accelerated in the boom years of the 1990s and continued well beyond. Over these decades, the gap between the wealthiest and poorest Americans steadily grew. According to one study, the top 1 percent of the nation received more than 80 percent of the total increase in America’s income between 1980 and 2005, nearly doubling their overall share of the nation’s wealth. The Economic Policy Institute, meanwhile, found that CEO wages relative to those of their workers skyrocketed: in 1965, the average CEO made twenty-four times as much as an average worker; in 1978, thirty-five times as much; in 1989, seventy-one times as much. The numbers continued to soar, and by the end of the 1990s, the average CEO was making three hundred times as much as the average worker.33

  With economic inequality came economic insecurity. As union jobs vanished, the new economy increasingly revolved around low-wage jobs that did not provide much stability. The number of households whose income fell by 25 percent or more within one year, according to the political scientist Jacob Hacker, had steadily risen since the 1980s. The loss of income came from either declining wages or higher medical expenses. In the 1970s, about 3–4 percent of the population could expect to see their income fall by 50 percent or more; that segment would reach almost 10 percent by 2004. More and more Americans declared bankruptcy, with filings increasing by 95 percent between 1990 and 1998, breaking historic records several times. With each downturn, the “new normal” that Americans could expect became worse.34

  Even the economic improvement of the mid-1990s did not reverse the long-term trends toward underemployment in several parts of the country. By 1996, there were over 36.5 million Americans below the poverty line. Notably, these individuals were largely concentrated in central cities. National recoveries did not matter much for the residents there, as the jobs simply never came to their neighborhoods. Unemployment rates for minority youth living in cities were five times the average of those for white youth in other parts of the country. The pressures were amplified by an influx of immigrants, who increased urban populations. While overall crime rates diminished across the 1990s, urban levels remained extremely high.35 During the decade, cities saw dramatic rises in the number of single-parent families and extremely high rates of infant mortality among African Americans. One study found that half of the poor children under six were in families whose income didn’t even reach half of the poverty line. Among the African American workforce, 9.9 percent were unemployed in 1998 compared to 4.7 percent of the national workforce.36

 
