As the media conglomerates grew stronger, labor’s position became more precarious.17 Most writers saw deregulation as a step backward, a loss of creative control, and a harbinger of what writing would be like in this new conglomerated landscape. Looking back on his own career, Tom Fontana saw a marked effect on his partnership with networks and on the kind of television that could be greenlighted: “A lot has shifted over the last thirty years that I’ve been doing television—one [part of it] is the corporate structure. The death of the small studios like MTM, Lorimar—that has had an extraordinarily compelling, negative effect on the quality of television. Now there are only six or seven studios, and most of them have a network that they have to feed. So their appetite becomes very narrow.”18
Like the demise of Fin-Syn, the 1996 Telecommunications Act received little attention in the popular press. In this dramatic overhaul of the Communications Act of 1934, the FCC pushed a deregulatory agenda more reminiscent of the Reagan era than of the Clinton presidency, an agenda later championed by chairman Michael Powell (son of future US Secretary of State Colin Powell). The act relaxed television station ownership limits and cross-ownership restrictions, revoked regulations on cable ownership established by the Cable Act of 1992, granted the FCC jurisdiction over digital broadcast satellites, and assigned existing television stations digital allotments without charge.19 Audience fragmentation and the multiplexing of network channels (NBC, MSNBC, CNBC, and so on) left the cable television landscape looking like a corporate-controlled digital land rush. Digital satellite services, which began in 1990, expanded throughout the decade, not only improving picture and sound quality but also increasing the number of channels available to the average home consumer (even if most of these channels were owned by the same few conglomerates). Although the Guild failed to fight deregulation effectively, its members would take the lessons learned from this period to plot a more forward-thinking approach to regulation in years to come. But the conglomerates were also learning from the gains of the independent industry outside their control.
The astonishing success of sex, lies and videotape in 1989, a film that Steven Soderbergh wrote, directed, and edited, heralded a golden age for the “indie film” and the hybrid filmmaker. New work by John Sayles, Joel and Ethan Coen, David Lynch, Jim Jarmusch (Mystery Train), and Spike Lee (Do the Right Thing) generated increased energy and creative opportunities for writer-directors operating outside the traditional Hollywood studio structure. Other talented hybrids whose first major successes premiered around the same time include Gus Van Sant (Drugstore Cowboy), John Singleton (Boyz n the Hood), and Quentin Tarantino (Reservoir Dogs). A second wave appeared in the mid-1990s, with Alexander Payne (Citizen Ruth), Ang Lee (The Wedding Banquet), Kevin Smith (Clerks), Todd Solondz (Welcome to the Dollhouse), Nicole Holofcener (Walking and Talking), Wes Anderson (Rushmore), and Paul Thomas Anderson (Boogie Nights). Emanuel Levy details how Miramax and New Line Cinema were the first companies to establish the independent studio business model, creating mini-major studios with steady financial backing, no ties to established studios, and dynamic personalities at the top who defined the corporate persona.20
The success of these mini-majors prompted the major studios to establish specialty branches of their own—Focus Features (Universal), Fox Searchlight, and Paramount Vantage—that enabled them to purchase compelling packaged film projects at low prices. These boutique units focused on prestige indie films, giving the conglomerates subsidiary brands that shrewdly defined themselves as “anti-studio.”21 At the same time, some writers attempted to stay as independent as possible. Gus Van Sant, who made Drugstore Cowboy with Miramax and has since worked with specialty studio distributors, compared his experiences at independent companies with those at Hollywood studios: “Hollywood can’t eliminate the creative side, the writers and directors and actors, which I think they’d like to do. . . . The studios think you should be able to just take bits of screenplays and have a technical writer restructure the dialogue. Design a screenplay, give it to the guy who knows how the camera works, then eliminate the director, and just let the camera guy set up the shot. Then have the actors do the performance part. But when they do that . . . there’s always something missing, like the conductor in the orchestra. Things start to go out of time.”22
Audiences may not have noticed the slow shift toward incorporating indies into the conglomerate structure. As Alisa Perren explains, the term “independent” is arguably misleading.23 In this new landscape, “indie” could mean an economic approach, a genre, or both. It could define everything from an independent voice coming from New York but produced by a studio (Do the Right Thing), to edgy independent productions picked up on the film festival circuit and distributed by Hollywood (Slacker, The Blair Witch Project), to a Miramax-style hybrid indie (Fargo, Boogie Nights), to a Hollywood production that only mimics the indie in its genre or style (Juno). Levy describes two definitions of the term “independent”: one specifically speaks to a structure of film financing, the other to the point of view, aesthetics, or vision of the film or filmmaker.24
During the late 1990s, studios began eliminating projects with midrange budgets; suddenly the majority of films greenlighted were either expensive mass-appeal blockbusters or low-budget indie-like productions. As independent films continued to profit commercially through the 2000s, win prestigious awards, and capture a small but significant share of the audience, studios aimed to package and corporatize independent film as a genre. Disney had purchased the top distributor of independent films, Miramax, back in 1993; by 2005, its founders had departed and the label was fully absorbed into the conglomerate.25 Although writers looking to distribute their films could find other outlets, this absorption signaled a clear end to an era of robust independent production.
Pushing the Boundaries of Jurisdiction
The categories of writing under the jurisdiction of the WGA expanded rapidly during the 1990s and 2000s. New genres and platforms tested the boundaries of what could and should be included under the union umbrella. Many writers working outside the Guild’s purview were eager for their work to be counted as “insider” labor so they could obtain guarantees on minimum pay, benefits, and pension funds. Television producers experimented with reality television and animated primetime series, both complicated territories for television writers and the Guild to navigate. At the same time, audiences in increasing numbers were plugging video game consoles and DVD players into their television sets to meet their entertainment needs.
Computer-generated imagery (CGI) was first integrated into popular cinema in the 1970s. By the 1990s, CGI’s potential for creating dynamic and innovative film and television cinematography had expanded dramatically. Soon, it transformed the nature of animated film and television. Series such as Fox’s The Simpsons (which started in 1989) and MTV’s Liquid Television (the original showcase for Beavis and Butt-head) reshaped viewers’—and studio executives’—perception of CGI’s potential not just for animated entertainment but also for enhancement of the viewer experience.
The overwhelming success of Matt Groening’s The Simpsons in the 1990s encouraged Fox executives to expand the network’s adult-oriented prime-time animation programming.26 In 1999, with The Simpsons, King of the Hill, Family Guy, The PJs, and Futurama on Fox’s prime-time schedule, the campaign for writers of animated television to unionize under the WGA mobilized with relative ease. Peter Chernin, who headed Fox at the time, had already assumed that primetime animation was under a WGA deal. When he found out these writers were not covered and that they had no health insurance, he agreed to put them under a Guild contract. Brian Walton, then the executive director of the WGA West, said, “It is our hope that this farsighted agreement will lay the framework for similar contracts in this genre.”27 David A. Goodman, a writer and producer on Futurama at the time, detailed the fortuitous nature and timing of this deal: “The fact that the shows . . . got one writer to be the Guild liaison and that they were all unified may have played a big role in Chernin saying yes. At that time there were less union difficulties. It was ten years after the previous strike and everything on television was covered under the WGA contract.” That those negotiations came at a moment of relative peace between the union and the studios, that these animated writers were all at one studio, and that the series were successful made it impossible for Chernin to overlook the writers’ individual and collective importance to the network. But “once reality [programming] started to take off,” Goodman said, “that started to tell the studios and networks that . . . everything doesn’t have to be covered under the WGA contract. That was really the chink in the armor.”28 Subsequently, no group of animation writers has managed to win such a victory again. And with the increase in primetime reality television, studios had more reason to deny new types of writers Guild privileges.
The effect of reality programming on the media landscape during this twenty-year period rattled television writers to their core. The series Survivor and Big Brother, both European franchises that premiered on American broadcast television in 2000, altered primetime and set a precedent for reality television’s success. Two years later, American Idol became an overnight sensation. The genre’s takeover of primetime hours was a boon to the networks. These series were cheap to produce, and their potential for ratings soon exceeded the best numbers for scripted programming. Reality was not simply a rising new genre; it began to colonize an older one. Documentary writing also evolved. Television documentarian Sydnye White, who wrote for America’s Most Wanted and NBC Nightly News, argued, “Reality television is making itself felt in the documentary world. Executives are demanding that documentaries have similar interpersonal conflicts and testy atmospheres. Let’s face it, tears pay off. So do fights.”29
The enormous popularity of reality shows gave executives reasons to consider a radical restructuring of their business practices. In 2003, reporter Bill Carter quoted Leslie Moonves, president of CBS Television, declaring the end of the old model of series television: “The world as we know it is over.” Carter spelled out the implications: “Not only will reality shows continue to flood networks’ schedules next fall, but television executives are also predicting . . . an end to the traditional television season. Instead of the time-honored formula of introducing shows en masse in September and ending them in May, broadcast networks want to stagger the shows’ debuts and banish repeats. . . . There could also be fewer orders for dramas and comedies, with a resulting shrinking of jobs for Hollywood writers and actors.”30 The advent of programming that could be produced cheaply, show strong ratings, adapt to a variety of production scheduling structures, be rolled out relatively quickly, and get shelved with little loss gave network executives increasing confidence about delaying negotiations with the guilds.
This is not to say that union workers were not employed on these shows. Rather, they found temporary employment on reality series. As the number of feature films and scripted series being shot in Los Angeles and New York decreased, the ranks of writers looking for work grew. Thus, some union writers—along with new writers—tried to make ends meet by working on these “unscripted” series. Even though shooting scripts for reality television often consist of sketches, monologues for hosts, or stories developed after the fact out of captured footage, reality producers inevitably needed professional writers—often now called story producers or story editors—to craft narrative arcs. As one former sitcom writer, now a story producer, said of his work: “I used to write with all the words in Webster’s Dictionary at my disposal; now I write with whatever words Paris Hilton says.”31
Home video game console sales became robust during the 1980s, and video arcade games began luring some young audiences away from movie theaters and live television. In the following two decades, video games matured in content, technology, and market reach. Some game companies designed content to be played on consoles plugged into a television set; others created content for computers and handheld devices. As games became increasingly complex and their story worlds more advanced, game production houses recognized that they could no longer focus solely on game play and graphics. They now needed skilled writers to build narrative and dialogue. In 1996, writer Del Reisman, who was president of the WGA from 1991 to 1993, described the union’s attempts to include new media within its jurisdiction: “We are deeply involved in preparing our members for the exploding interactive media, with frequent seminars, panels, and demonstrations. . . . We are no longer in an institutional industry. New production entities with new sources of financing, new patterns of co-production and new lines of distribution and exhibition are forming and reforming.”32 However, members’ comprehension of video games in the 1990s mostly came from their experiences as players or as parents of players rather than as writers interested in new narrative structures, let alone as employees considering professional opportunities.
Compared with steady improvements in visual and gaming technologies, advances in game writing were often slower to emerge. John Zuur Platten, a WGA member and game writer of Transformers: The Game, recalled: “When I started writing games, the girl answering the phones at the front desk, who took a creative writing class in junior college, wrote the game’s dialogue and didn’t get any extra pay for that.”33 As game writers recount, this disjunction led to a kind of reverse engineering of game narratives. Writers were often called in to consult on projects to fill in narrative gaps or to write dialogue after much of the technological structure and design of the game had already been created.
Like everyone else working within video game production, game writers were not necessarily based in New York or Los Angeles, and so the notion of unionization under the WGA was hardly on their radar. The industry’s work ethic was modeled on Silicon Valley’s far more than on Hollywood’s. Even if some writers of these games were WGA members who also wrote for film or television, it was impossible to push the game companies to agree to Guild rules. Some writers thought they might have a better chance of unionizing once the major media conglomerates began making games. As video game writer and WGA member Christy Marx said in 2004, “The changeover will come when more of the big media conglomerates are controlling this stuff. Companies that just come from the game world purely, they’re used to abusing people. It’s common for managers to work people 80 hours a week and run them into the ground. It’s sort of taken for granted in the game world.”34 Because of these attitudes, the unionization of game industry workers is still in its infancy.
The Digital Versatile Disc, an optical disc storage format, was a revolution for the industry. The combination of cheaper production costs and higher retail value meant that every dollar earned by a VHS was doubled by a DVD sale in the increasingly powerful home video market.35 For writers, the DVD was not just an ancillary marketing tool; because of DVD extras that often included conversations with cast and crew, many writers’ names became more prominent and their faces more recognizable to interested audiences. Writers could be heard in DVD commentaries, be seen in behind-the-scenes extras, and communicate directly with audiences through conversations on websites. In particular, consumers became addicted to DVD box sets of television series, learning along the way about the series’ showrunners, who were promoted in the DVD extras as the real-life heroes responsible for creating or shepherding the beloved series. The Guild tried to take advantage of this trend; an internal memo to WGA executive director Brian Walton from members of the WGA staff in 1995 suggested that the “writer is a box of soap”: “We have a product to sell, the writer. We need to market our products. We know there is a need for our product. We need to establish the brand image of the writer. . . . BENEFIT: establishes writer[s] as superstars in mass media. Also creates economic return for the Guild.”36
Especially for television, this brand-name recognition defies the nature of collective authorship, and yet the industry and the popular press continue to celebrate the individual writer as a new iteration of the American Dream.37 Writer-producer David Goodman remarked: “As a career track, most people in the country weren’t aware that you could break into it. Now they have heard of these people, they start to identify with them, and they want to be a writer. More people are understanding that there is a man behind the curtain and ‘Maybe I want to do that.’”38 Media studies scholar Denise Mann refers to the persistence of the singular auteur as an obsolete paradigm in a discussion of contemporary quality television production.39 And yet, the DVD box set promotes the writer-auteur of a television series in ways never previously imagined.
All of these developments show that this period in the American media industries was an era of prolonged and profound transition: mergers and acquisitions, centralization and further deregulation, significant technological transformation, and a growing concentration of capital. Media scholar Michael Curtin describes how this dynamic impulse created “interactive exchanges, multiple sites of productivity, and diverse modes of interpretation and use.”40 The rise of digital media, the fragmentation of audiences, and the conglomeration of media corporations in the 1990s set American media on new economic, social, political, and cultural paths as they entered a new millennium.
The State of the Guilds
Homogenization is good for milk, but bad for ideas.
—Patric Verrone, president of the WGA West, statement read before the FCC public hearing on media ownership, 3 October 200641
As DVDs grew in popularity, writers realized precisely how much they had lost in their 1988 negotiations. Robin Schiff, who wrote Romy and Michele’s High School Reunion, explained: “Having been in the Guild . . . since 1980, I’ve been around not only for strikes, but also [long enough] to have a sense of how the membership views the Guild—which was incompetent, a joke, not worth standing behind, like a litter of puppies that you couldn’t get together. And in a way, that’s what the companies counted on. There was so much division within our own membership that we could never be effective. You can argue that one guild can’t be effective by itself anyway when seven multinational corporations are basically allowed to negotiate together and have a consortium.”42 Schiff’s frustrations with the Guild echoed many WGA members’ concerns that every three years they faced an increasingly powerful force across the table at contract negotiations. To Wall Street’s delight, in 1995 Disney purchased Cap Cities/ABC for $19 billion, creating a conglomerate with a combined worth of $48 billion. A year later, Time Warner purchased Turner Broadcasting System; Viacom, which had already picked up Paramount, later added CBS for $34.8 billion. With little inflationary pressure, media industry expansion in this New Economy accelerated.