Postmodernism not only rejected all meta-narratives but also emphasized the instability of language. One of postmodernism’s founding fathers, Jacques Derrida—who would achieve celebrity status on American campuses in the 1970s and 1980s thanks in large part to such disciples as Paul de Man and J. Hillis Miller—used the word “deconstruction” to describe the sort of textual analysis he pioneered that would be applied not just to literature but to history, architecture, and the social sciences as well.
Deconstruction posited that all texts are unstable and irreducibly complex and that ever-variable meanings are imputed by readers and observers. In focusing on the possible contradictions and ambiguities of a text (and articulating such arguments in deliberately tangled and pretentious prose), it promulgated an extreme relativism that was ultimately nihilistic in its implications: anything could mean anything; an author’s intent did not matter, could not in fact be discerned; there was no such thing as an obvious or commonsense reading, because everything had an infinitude of meanings. In short, there was no such thing as truth.
As David Lehman recounted in his astute book Signs of the Times, the worst suspicions of critics of deconstruction were confirmed when the Paul de Man scandal exploded in 1987 and deconstructionist rationales were advanced to defend the indefensible.
De Man, a professor at Yale and one of deconstruction’s brightest stars, had achieved an almost cultlike following in academic circles. Students and colleagues described him as a brilliant, charismatic, and charming scholar who had fled Nazi Europe, where, he implied, he had been a member of the Belgian Resistance. A very different portrait would emerge from Evelyn Barish’s biography The Double Life of Paul de Man: an unrepentant con man—an opportunist, bigamist, and toxic narcissist who’d been convicted in Belgium of fraud, forgery, and falsifying records.
The most shocking news had been revealed in 1987, four years after his death: a young Belgian researcher discovered that de Man had written at least one hundred articles for a pro-Nazi Belgian publication, Le Soir, during World War II—a publication that espoused a virulent anti-Semitism, declaring in one editorial that “we are determined to forbid ourselves any cross-breeding with them and to liberate ourselves spiritually from their demoralizing influence in the realm of thought, literature, and the arts.”
In the most notorious of his Le Soir articles, de Man argued that “Jewish writers have always remained in the second rank” and had therefore failed to exercise “a preponderant influence” on the evolution of contemporary European civilization. “One can thus see,” he wrote, “that a solution to the Jewish problem that would lead to the creation of a Jewish colony isolated from Europe would not have, for the literary life of the West, regrettable consequences. It would lose, in all, some personalities of mediocre worth and would continue, as in the past, to develop according to its higher laws of evolution.”
As news of de Man’s alarming collaborationist writings swept through academia, some scholars wondered if de Man’s shameful and secret past had informed his theories about deconstruction—for instance, his contention that “considerations of the actual and historical existence of writers are a waste of time.”
More disturbing still were efforts by some of de Man’s defenders, like Derrida, to use the principles of deconstruction to try to explain away de Man’s anti-Semitic writings, suggesting that his words actually subverted what they appeared to say or that there was too much ambiguity inherent in his words to assign moral responsibility.
One de Man admirer, cited by Lehman, tried to argue that de Man’s remarks about Jewish writers were a case of “irony” misfiring, contending that the essay’s tone was “one of detached mockery throughout the sections dealing with the Jews, and the object of the mockery is clearly not the Jews but rather the anti-Semites.” In other words, the writer was suggesting that de Man had meant the very opposite of what his Le Soir columns stated.
Though deconstructionists are fond of employing jargon-filled prose and perversely acrobatic syntax, some of the terms they use—like the “indeterminacy of texts,” “alternative ways of knowing,” and “linguistic instability”—feel like pretentious versions of phrases recently used by Trump aides to explain away his lies, flip-flops, and bad-faith promises. For instance: a Trump representative telling an adviser to the Japanese prime minister, Shinzo Abe, that they didn’t “have to take each word that Mr. Trump said publicly literally”; and a former campaign manager, Corey Lewandowski, asserting that the problem with the media is “You guys took everything Donald Trump said so literally. The American people didn’t.”
3
“MOI” AND THE RISE OF SUBJECTIVITY
Our subjectivity is so completely our own.
—SPIKE JONZE
Parallel with academia’s embrace of postmodernism was the blossoming in the 1970s of what Christopher Lasch called “the culture of narcissism” and what Tom Wolfe memorably termed the “Me Decade”—a tidal wave of navel-gazing, self-gratification, and attention craving that these two authors attributed to very different causes.
Lasch saw narcissism as a defensive reaction to social change and instability—looking out for number one in a hostile, threatening world. In his 1979 book, The Culture of Narcissism, he argued that a cynical “ethic of self-preservation and psychic survival” had come to afflict America—a symptom of a country grappling with defeat in Vietnam, a growing mood of pessimism, a mass media culture centered on celebrity and fame, and centrifugal forces that were shrinking the role families played in the transmission of culture.
The narcissistic patient who had become increasingly emblematic of this self-absorbed age, Lasch wrote, often experienced “intense feelings of rage,” “a sense of inner emptiness,” “fantasies of omnipotence and a strong belief in [his] right to exploit others”; such a patient may be “chaotic and impulse-ridden,” “ravenous for admiration but contemptuous of those he manipulates into providing it,” and inclined to conform “to social rules more out of fear of punishment than from a sense of guilt.”
In contrast to Lasch, Tom Wolfe saw the explosion of “Me…Me…Me” in the 1970s as an altogether happier, more hedonistic development—an act of class liberation, powered by the postwar economic boom, which had left the working and middle classes with the leisure time and disposable income to pursue the sorts of vain activities once confined to aristocrats—the “remaking, remodeling, elevating, and polishing” of one’s own glorious self.
Economic times would grow considerably darker in the twenty-first century, but the self-absorption that Wolfe and Lasch described would remain a lasting feature of Western life, from the “Me Decade” of the 1970s on through the “selfie” age of Kim and Kanye. Social media would further accelerate the ascendance of what the Columbia Law School professor Tim Wu described as “the preening self” and the urge to “capture the attention of others with the spectacle of one’s self.”
With this embrace of subjectivity came the diminution of objective truth: the celebration of opinion over knowledge, feelings over facts—a development that both reflected and helped foster the rise of Trump.
Three examples. Number 1: Trump, who has been accused of greatly inflating his wealth, was asked about his net worth in a 2007 court deposition. His answer: it depends. “My net worth fluctuates, and it goes up and down with markets and with attitudes and with feelings, even my own feelings.” He added that it varied depending on his “general attitude at the time that the question may be asked.”
Number 2: Asked whether he’d questioned Vladimir Putin about Russian interference in the election, Trump replied, “I believe that he feels that he and Russia did not meddle in the election.”
Number 3: During the Republican National Convention in 2016, the CNN anchor Alisyn Camerota asked Newt Gingrich about Trump’s dark, nativist law-and-order speech, which inaccurately depicted America as a country beset by violence and crime, and she was sharply rebutted by the former Speaker of the House. “I understand your view,” Gingrich said. “The current view is that liberals have a whole set of statistics which theoretically may be right, but it’s not where human beings are. People are frightened. People feel that their government has abandoned them.”
Camerota pointed out that the crime statistics weren’t liberal numbers; they came from the FBI.
The following exchange took place:
GINGRICH: No, but what I said is equally true. People feel it.
CAMEROTA: They feel it, yes, but the facts don’t support it.
GINGRICH: As a political candidate, I’ll go with how people feel and I’ll let you go with the theoreticians.
* * *
The tendency of Americans to focus, myopically, on their self-pursuits—sometimes to the neglect of their civic responsibilities—is not exactly new. In Democracy in America, written more than a century and a half before people started using Facebook and Instagram to post selfies and the internet was sorting us into silos of like-minded souls, Alexis de Tocqueville noted Americans’ tendency to withdraw into “small private societies, united together by similitude of conditions, habits, and customs,” in order “to indulge themselves in the enjoyments of private life.” He worried that this self-absorption would diminish a sense of duty to the larger community, opening the way for a kind of soft despotism on the part of the nation’s rulers—power that does not tyrannize, but “compresses, enervates, extinguishes, and stupefies a people” to the point where they are “reduced to nothing better than a flock of timid and industrious animals, of which the government is the shepherd.” This was one possible cost of a materialistic society, he predicted, where people become so focused on procuring “the petty and paltry pleasures with which they glut their lives” that they neglect their responsibilities as citizens; it was difficult to conceive, he wrote, how such people who “have entirely given up the habit of self-government should succeed in making a proper choice of those by whom they are to be governed.”
In the mid-twentieth century, the pursuit of self-fulfillment exploded within both the counterculture and the establishment. Predating Esalen and EST and the encounter groups that attracted hippies and New Age seekers intent on expanding their consciousness in the 1960s and 1970s were two influential figures whose doctrines of self-realization were more materialistic and more attractive to politicians and suburban Rotarians. Norman Vincent Peale, the author of the 1952 self-help bestseller The Power of Positive Thinking—known as “God’s salesman” for his hawking of the prosperity gospel—was admired by Trump’s father, Fred, and the younger Trump would internalize the celebrity pastor’s teachings on self-fulfillment and the power of the mind to create its own reality. “Any fact facing us, however difficult, even seemingly hopeless, is not so important as our attitude toward that fact,” Peale wrote, seeming to promote the doctrine of denial along with the doctrine of success. “A confident and optimistic thought pattern can modify or overcome the fact altogether.”
Ayn Rand, also admired by Trump (over the years, The Fountainhead is one of the few novels he’s cited as a favorite), won the fealty of several generations of politicians (including Paul Ryan, Rand Paul, Ron Paul, and Clarence Thomas) with her transactional view of the world, her equation of success and virtue, and her proud embrace of unfettered capitalism. Her argument that selfishness is a moral imperative, that man’s “highest moral purpose” is “the pursuit of his own happiness,” would resonate with Trump’s own zero-sum view of the world and his untrammeled narcissism.
* * *
As the West lurched through the cultural upheavals of the 1960s and 1970s and their aftermath, artists struggled with how to depict this fragmenting reality. Some writers like John Barth, Donald Barthelme, and William Gass created self-conscious, postmodernist fictions that put more emphasis on form and language than on conventional storytelling. Others adopted a minimalistic approach, writing pared-down, narrowly focused stories emulating the fierce concision of Raymond Carver. And as the pursuit of broader truths became more and more unfashionable in academia, and as daily life came to feel increasingly unmoored, some writers chose to focus on the smallest, most personal truths: they wrote about themselves.
American reality had become so confounding, Philip Roth wrote in a 1961 essay (1961!), that it felt like “a kind of embarrassment to one’s own meager imagination.” This had resulted, he wrote, in the “voluntary withdrawal of interest by the writer of fiction from some of the grander social and political phenomena of our times,” and the retreat, in his own case, to the more knowable world of the self.
In a controversial 1989 essay, Tom Wolfe lamented these developments, mourning what he saw as the demise of old-fashioned realism in American fiction, and he urged novelists to “head out into this wild, bizarre, unpredictable, Hog-stomping Baroque country of ours and reclaim it as literary property.” He tried this himself in novels like The Bonfire of the Vanities and A Man in Full, using his skills as a reporter to help flesh out a spectrum of subcultures with Balzacian detail. But while Wolfe had been an influential advocate in the 1970s of the New Journalism (which put a new emphasis on the voice and point of view of the reporter), his new manifesto didn’t win many converts in the literary world. Instead, writers as disparate as Louise Erdrich, David Mitchell, Don DeLillo, Julian Barnes, Chuck Palahniuk, Gillian Flynn, and Lauren Groff would play with devices (like multiple points of view, unreliable narrators, and intertwining story lines) pioneered decades earlier by innovators like Faulkner, Woolf, Ford Madox Ford, and Nabokov to try to capture the new Rashomon-like reality in which subjectivity rules and, in the infamous words of former president Bill Clinton, truth “depends on what the meaning of the word ‘is’ is.”
But what Roth called “the sheer fact of self, the vision of self as inviolable, powerful, and nervy, self as the only real thing in an unreal environment,” would remain more comfortable territory for many writers. In fact, it would lead, at the turn of the millennium, to a remarkable flowering of memoir writing, including such classics as Mary Karr’s The Liars’ Club and Dave Eggers’s A Heartbreaking Work of Staggering Genius—works that established their authors as among the foremost voices of their generation.
The memoir boom and the popularity of blogging at the turn of the millennium would eventually culminate in Karl Ove Knausgaard’s six-volume autobiographical novel—filled with minutely detailed descriptions, drawn from the author’s own daily life. Along the way, there were also a lot of self-indulgent, self-dramatizing works by other authors that would have been better left in writers’ private journals or social media accounts. The reductio ad absurdum of this navel-gazing was James Frey’s bestselling book A Million Little Pieces, which was sold as a memoir but which the Smoking Gun website reported in January 2006 contained “wholly fabricated or wildly embellished details of his purported criminal career, jail terms and status as an outlaw ‘wanted in three states.’ ” Frey, who seems to have engaged in this act of self-dramatization to make himself out to be a more notorious figure than he actually was (presumably so his subsequent “redemption” would be all the more impressive as an archetypal tale of recovery), later conceded that “most of what” the Smoking Gun site reported “was pretty accurate.” For some readers, angry that they had been sold a bill of goods, Frey’s book was a con job, a repudiation of the very qualities—honesty, authenticity, candor—that memoirs are supposed to embody, but other readers shrugged off the distinction between fact and fiction: their response a symptom of just how comfortable people had become with the blurred lines of truth.
* * *
Personal testimony also became fashionable on college campuses, as the concept of objective truth fell out of favor and empirical evidence gathered by traditional research came to be regarded with suspicion. Academic writers began prefacing scholarly papers with disquisitions on their own “positioning”—their race, religion, gender, background, personal experiences that might inform or skew or ratify their analysis. Some proponents of the new “moi criticism” began writing full-fledged academic autobiographies, Adam Begley reported in Lingua Franca in 1994, noting that the trend toward autobiography traced back to the 1960s, to early feminist consciousness-raising groups, and that it often “spread in tandem with multiculturalism: News about minority experience often comes packaged in the first person singular. Ditto for gay studies and queer theory.”
In her 1996 book, Dedication to Hunger: The Anorexic Aesthetic in Modern Culture, the scholar Leslie Heywood used events from her own life (like her anorexia and a humiliating relationship with a married man) to draw analogies between anorexia and modernism, an approach that had the effect of reducing masterpieces like T. S. Eliot’s The Waste Land to case studies in an anti-women, anti-fat aesthetic.
Personal stories or agendas started turning up in biographies, too. No longer were biographies simple chronicles of other people’s lives. Instead, they became platforms for philosophical manifestos (Norman Mailer’s Portrait of Picasso as a Young Man), feminist polemics (Francine du Plessix Gray’s Rage and Fire, a portrait of Flaubert’s mistress Louise Colet), and deconstructionist exercises (S. Paige Baty’s American Monroe: The Making of a Body Politic).