The David Foster Wallace Reader

by David Foster Wallace


  And, more to the original point, if television can invite Joe Briefcase into itself via in-gags and irony, it can ease that painful tension between Joe’s need to transcend the crowd and his inescapable status as Audience-member. For to the extent that TV can flatter Joe about “seeing through” the pretentiousness and hypocrisy of outdated values, it can induce in him precisely the feeling of canny superiority it’s taught him to crave, and can keep him dependent on the cynical TV-watching that alone affords this feeling.

  And to the extent that it can train viewers to laugh at characters’ unending put-downs of one another, to view ridicule as both the mode of social intercourse and the ultimate art-form, television can reinforce its own queer ontology of appearance: the most frightening prospect, for the well-conditioned viewer, becomes leaving oneself open to others’ ridicule by betraying passé expressions of value, emotion, or vulnerability. Other people become judges; the crime is naïveté. The well-trained viewer becomes even more allergic to people. Lonelier. Joe B.’s exhaustive TV-training in how to worry about how he might come across, seem to watching eyes, makes genuine human encounters even scarier. But televisual irony has the solution: further viewing begins to seem almost like required research, lessons in the blank, bored, too-wise expression that Joe must learn how to wear for tomorrow’s excruciating ride on the brightly lit subway, where crowds of blank, bored-looking people have little to look at but each other.

  What does TV’s institutionalization of hip irony have to do with U.S. fiction? Well, for one thing, American literary fiction tends to be about U.S. culture and the people who inhabit it. Culture-wise, shall I spend much of your time pointing out the degree to which televisual values influence the contemporary mood of jaded weltschmerz, self-mocking materialism, blank indifference, and the delusion that cynicism and naïveté are mutually exclusive? Can we deny connections between an unprecedentedly powerful consensual medium that suggests no real difference between image and substance, on one hand, and stuff like the rise of Teflon presidencies, the establishment of nationwide tanning and liposuction industries, the popularity of “Vogueing” to a cynical synthesized command to “Strike a Pose”? Or, in contemporary art, that televisual disdain for “hypocritical” retrovalues like originality, depth, and integrity has no truck with those recombinant “appropriation” styles of art and architecture in which “past becomes pastiche,” or with the repetitive solmizations of a Glass or a Reich, or with the self-conscious catatonia of a platoon of Raymond Carver wannabes?

  In fact, the numb blank bored demeanor—what one friend calls the “girl-who’s-dancing-with-you-but-would-obviously-rather-be-dancing-with-somebody-else” expression—that has become my generation’s version of cool is all about TV. “Television,” after all, literally means “seeing far”; and our six hours daily not only helps us feel up-close and personal at like the Pan-Am Games or Operation Desert Shield but also, inversely, trains us to relate to real live personal up-close stuff the same way we relate to the distant and exotic, as if separated from us by physics and glass, extant only as performance, awaiting our cool review. Indifference is actually just the ’90s’ version of frugality for U.S. young people: wooed several gorgeous hours a day for nothing but our attention, we regard that attention as our chief commodity, our social capital, and we are loath to fritter it. In the same regard, see that in 1990, flatness, numbness, and cynicism in one’s demeanor are clear ways to transmit the televisual attitude of stand-out-transcendence—flatness and numbness transcend sentimentality, and cynicism announces that one knows the score, was last naïve about something at maybe like age four.

  Whether or not 1990’s youth culture seems as grim to you as it does to me, surely we can agree that the culture’s TV-defined pop ethic has pulled a marvelous touché on the postmodern aesthetic that originally sought to co-opt and redeem the pop. Television has pulled the old dynamic of reference and redemption inside-out: it is now television that takes elements of the postmodern—the involution, the absurdity, the sardonic fatigue, the iconoclasm and rebellion—and bends them to the ends of spectation and consumption. This has been going on for a while. As early as ’84, critics of capitalism were warning that “What began as a mood of the avant-garde has surged into mass culture.”28

  But postmodernism didn’t just all of a sudden “surge” into television in 1984. Nor have the vectors of influence between the postmodern and the televisual been one-way. The chief connection between today’s television and today’s fiction is historical. The two share roots. For postmodern fiction—authored almost exclusively by young white overeducated males—clearly evolved as an intellectual expression of the “rebellious youth culture” of the ’60s and ’70s. And since the whole gestalt of youthful U.S. rebellion was made possible by a national medium that erased communicative boundaries between regions and replaced a society segmented by location and ethnicity with what rock music critics have called “a national self-consciousness stratified by generation,”29 the phenomenon of TV had as much to do with postmodernism’s rebellious irony as it did with Peaceniks’ protest rallies.

  In fact, by offering young, overeducated fiction writers a comprehensive view of how hypocritically the U.S.A. saw itself circa 1960, early television helped legitimize absurdism and irony as not just literary devices but sensible responses to a ridiculous world. For irony—exploiting gaps between what’s said and what’s meant, between how things try to appear and how they really are—is the time-honored way artists seek to illuminate and explode hypocrisy. And the television of lone-gunman westerns, paternalistic sitcoms, and jut-jawed law enforcement circa 1960 celebrated what by then was a deeply hypocritical American self-image. Miller describes nicely how 1960s sitcoms, like the westerns that preceded them,

  negated the increasing powerlessness of white-collar males with images of paternal strength and manly individualism. Yet by the time these sit-coms were produced, the world of small business [whose virtues were the Hugh Beaumontish ones of “self-possession, probity, and sound judgment”] had been… superseded by what C. Wright Mills called “the managerial demi-urge,” and the virtues personified by… Dad were in fact passé.30

  In other words, early U.S. TV was a hypocritical apologist for values whose reality had become attenuated in a period of corporate ascendancy, bureaucratic entrenchment, foreign adventurism, racial conflict, secret bombing, assassination, wiretaps, etc. It’s not one bit accidental that postmodern fiction aimed its ironic crosshairs at the banal, the naïve, the sentimental and simplistic and conservative, for these qualities were just what ’60s TV seemed to celebrate as distinctively American.

  And the rebellious irony in the best postmodern fiction wasn’t just credible as art; it seemed downright socially useful in its capacity for what counterculture critics called “a critical negation that would make it self-evident to everyone that the world is not as it seems.”31 Kesey’s black parody of asylums suggested that our arbiters of sanity were often crazier than their patients; Pynchon reoriented our view of paranoia from deviant psychic fringe to central thread in the corporo-bureaucratic weave; DeLillo exposed image, signal, data and tech as agents of spiritual chaos and not social order. Burroughs’s icky explorations of American narcosis exploded hypocrisy; Gaddis’s exposure of abstract capital as deforming exploded hypocrisy; Coover’s repulsive political farces exploded hypocrisy.

  Irony in postwar art and culture started out the same way youthful rebellion did. It was difficult and painful, and productive—a grim diagnosis of a long-denied disease. The assumptions behind early postmodern irony, on the other hand, were still frankly idealistic: it was assumed that etiology and diagnosis pointed toward cure, that a revelation of imprisonment led to freedom.

  So then how have irony, irreverence, and rebellion come to be not liberating but enfeebling in the culture today’s avant-garde tries to write about? One clue’s to be found in the fact that irony is still around, bigger than ever after 30 long years as the dominant mode of hip expression. It’s not a rhetorical mode that wears well. As Hyde (whom I pretty obviously like) puts it, “Irony has only emergency use. Carried over time, it is the voice of the trapped who have come to enjoy their cage.”32 This is because irony, entertaining as it is, serves an almost exclusively negative function. It’s critical and destructive, a ground-clearing. Surely this is the way our postmodern fathers saw it. But irony’s singularly unuseful when it comes to constructing anything to replace the hypocrisies it debunks. This is why Hyde seems right about persistent irony being tiresome. It is unmeaty. Even gifted ironists work best in sound bites. I find gifted ironists sort of wickedly fun to listen to at parties, but I always walk away feeling like I’ve had several radical surgical procedures. And as for actually driving cross-country with a gifted ironist, or sitting through a 300-page novel full of nothing but trendy sardonic exhaustion, one ends up feeling not only empty but somehow… oppressed.

  Think, for a moment, of Third World rebels and coups. Third World rebels are great at exposing and overthrowing corrupt hypocritical regimes, but they seem noticeably less great at the mundane, non-negative task of then establishing a superior governing alternative. Victorious rebels, in fact, seem best at using their tough, cynical rebel-skills to avoid being rebelled against themselves—in other words, they just become better tyrants.

  And make no mistake: irony tyrannizes us. The reason why our pervasive cultural irony is at once so powerful and so unsatisfying is that an ironist is impossible to pin down. All U.S. irony is based on an implicit “I don’t really mean what I’m saying.” So what does irony as a cultural norm mean to say? That it’s impossible to mean what you say? That maybe it’s too bad it’s impossible, but wake up and smell the coffee already? Most likely, I think, today’s irony ends up saying: “How totally banal of you to ask what I really mean.” Anyone with the heretical gall to ask an ironist what he actually stands for ends up looking like an hysteric or a prig. And herein lies the oppressiveness of institutionalized irony, the too-successful rebel: the ability to interdict the question without attending to its subject is, when exercised, tyranny. It is the new junta, using the very tool that exposed its enemy to insulate itself.

  This is why our educated teleholic friends’ use of weary cynicism to try to seem superior to TV is so pathetic. And this is why the fiction-writing citizen of our televisual culture is in such very deep shit. What do you do when postmodern rebellion becomes a pop-cultural institution? For this of course is the second answer to why avant-garde irony and rebellion have become dilute and malign. They have been absorbed, emptied, and redeployed by the very televisual establishment they had originally set themselves athwart.

  Not that television is culpable for any evil here. Just for immoderate success. This is, after all, what TV does: it discerns, decocts, and re-presents what it thinks U.S. culture wants to see and hear about itself. No one and everyone is at fault for the fact that television started gleaning rebellion and cynicism as the hip upscale Baby-Boomer imago populi. But the harvest has been dark: the forms of our best rebellious art have become mere gestures, schticks, not only sterile but perversely enslaving. How can even the idea of rebellion against corporate culture stay meaningful when Chrysler Inc. advertises trucks by invoking “The Dodge Rebellion”? How is one to be a bona fide iconoclast when Burger King sells onion rings with “Sometimes You Gotta Break the Rules”? How can an Image-Fiction writer hope to make people more critical of televisual culture by parodying television as a self-serving commercial enterprise when Pepsi and Subaru and FedEx parodies of self-serving commercials are already doing big business? It’s almost a history lesson: I’m starting to see just why turn-of-the-last-century Americans’ biggest fear was of anarchists and anarchy. For if anarchy actually wins, if rulelessness becomes the rule, then protest and change become not just impossible but incoherent. It’d be like casting a ballot for Stalin: you are voting for an end to all voting.

  So here’s the stumper for the U.S. writer who both breathes our cultural atmosphere and sees himself heir to whatever was neat and valuable in avant-garde literature: how to rebel against TV’s aesthetic of rebellion, how to snap readers awake to the fact that our televisual culture has become a cynical, narcissistic, essentially empty phenomenon, when television regularly celebrates just these features in itself and its viewers? These are the very questions DeLillo’s poor schmuck of a popologist was asking back in ’85 about America, that most photographed of barns:

  “What was the barn like before it was photographed?” he said. “What did it look like, how was it different from other barns, how was it similar to other barns? We can’t answer these questions because we’ve read the signs, seen the people snapping the pictures. We can’t get outside the aura. We’re part of the aura. We’re here, we’re now.”

  He seemed immensely pleased by this.33

  end of the end of the line

  What responses to television’s commercialization of the modes of literary protest seem possible, then, today? One obvious option is for the fiction writer to become reactionary, fundamentalist. Declare contemporary television evil and contemporary culture evil and turn one’s back on the whole spandexed mess and invoke instead good old pre-1960s Hugh Beaumontish virtues and literal readings of the Testaments and be pro-Life, anti-Fluoride, antediluvian. The problem with this is that Americans who’ve opted for this tack seem to have one eyebrow straight across their forehead and knuckles that drag on the ground and really tall hair and in general just seem like an excellent crowd to want to transcend. Besides, the rise of Reagan/Bush/Gingrich showed that hypocritical nostalgia for a kinder, gentler, more Christian pseudo-past is no less susceptible to manipulation in the interests of corporate commercialism and PR image. Most of us will still take nihilism over neanderthalism.

  Another option would be to adopt a somewhat more enlightened political conservatism that exempts viewer and networks alike from any complicity in the bitter stasis of televisual culture and which instead blames all TV-related problems on certain correctable defects in technology. Enter media futurologist George Gilder, a Hudson Institute senior fellow and author of Life After Television: The Coming Transformation of Media and American Life. The single most fascinating thing about Life After Television is that it’s a book with commercials. Published in something called The Larger Agenda Series by one “Whittle Direct Books” in Federal Express Inc.’s Knoxville headquarters, the book sells for only $11.00 hard including postage, is big and thin enough to look great on executive coffee tables, and has very pretty full-page ads for Federal Express on every fifth page. The book’s also largely a work of fiction, plus it’s a heartrending dramatization of why anti-TV conservatives, motivated by simple convictions like “Television is at heart a totalitarian medium” whose “system is an alien and corrosive force in democratic capitalism,” are going to be of little help with our ultraradical-TV problems, attached as conservative intellectuals are to their twin tired remedies for all U.S. ills, viz. the beliefs that (1) the discerning consumer-instincts of the Little Guy will correct all imbalances if only Big Systems will quit stifling his Freedom to Choose, and that (2) technology-bred problems can be resolved technologically.

  Gilder’s basic diagnosis runs thus. Television as we know and suffer it is “a technology with supreme powers but deadly flaws.” The really fatal flaw is that the whole structure of television programming, broadcasting, and reception is still informed by the technological limitations of the old vacuum tubes that first enabled TV. The

  expense and complexity of these tubes used in television sets meant that most of the processing of signals would have to be done at the [networks],

  a state of affairs which

  dictated that television would be a top-down system—in electronic terms, a “master-slave” architecture. A few broadcasting centers would originate programs for millions of passive receivers, or “dumb terminals.”

  By the time the transistor (which does essentially what vacuum tubes do but in less space at lower cost) found commercial applications, the top-down TV system was already entrenched and petrified, dooming viewers to docile reception of programs they were dependent on a very few networks to provide, and creating a “psychology of the masses” in which a trio of programming alternatives aimed to appeal to millions and millions of Joe B.’s. The TV signals are analog waves. Analogs are the required medium, since “With little storage or processing available at the set, the signals… would have to be directly displayable waves,” and “analog waves directly simulate sound, brightness, and color.” But analog waves can’t be saved or edited by their recipient. They’re too much like life: there in gorgeous toto one instant and then gone. What the poor TV viewer gets is only what he sees. This state of affairs has cultural consequences Gilder describes in apocalyptic detail. Even “High Definition Television” (HDTV), touted by the industry as the next big advance in entertainment, will, according to Gilder, be just the same vacuous emperor in a snazzier suit.

  But for Gilder, TV, still clinging to the crowd-binding and hierarchical technologies of yesterdecade, is now doomed by the advances in microchip and fiber-optic technology of the last few years. The user-friendly microchip, which consolidates the activities of millions of transistors on one 49¢ wafer, and whose capacities will get even more attractive as controlled electron-conduction approaches the geodesic paradigm of efficiency, will allow receivers—TV sets—to do much of the image-processing that has hitherto been done “for” the viewer by the broadcaster. In another happy development, transporting images through glass fibers rather than via the EM spectrum will allow people’s TV sets to be hooked up with each other in a kind of interactive net instead of all feeding passively at the transmitting teat of a single broadcaster. And fiber-optic transmissions have the further advantage that they conduct characters of information digitally. Since, as Gilder explains, “digital signals have an advantage over analog signals in that they can be stored and manipulated without deterioration” as well as being crisp and interferenceless as quality CDs, they’ll allow the microchipped television receiver (and thus the viewer) to enjoy much of the discretion over selection, manipulation, and recombination of video images that is today restricted to the director’s booth.

 
