The Conservative Sensibility

by George F. Will


  New technologies have exacerbated certain disturbances in the intellectual atmosphere. In 2005, Lynne Truss presciently warned that we were slouching into “an age of social autism” with a “Universal Eff-Off Reflex.” Before video streaming brought us binge-watching, she foresaw people entertaining themselves into inanition with portable technologies that enable “limitless self-absorption” and make people solipsistic and unmannerly. Truss foresaw an age of “hair-trigger sensitivity” and “lazy moral relativism combined with aggressive social insolence.”46 This was twelve years before some Wellesley College professors said that inviting controversial, meaning conservative, speakers to campus injures students by forcing them to “invest time and energy in rebutting the speakers’ arguments.”47

  The consequences of growing intolerance are enumerated by Tom Nichols, who says that our devices and social media are producing people who confuse “Internet grazing” with research and equate this faux research with higher education. Today, students demand to run institutions that they insist should treat them as fragile children. “It is,” Nichols writes, “a new Declaration of Independence: no longer do we hold these truths to be self-evident, we hold all truths to be self-evident, even the ones that aren’t true. All things are knowable and every opinion on any subject is as good as any other.” In today’s therapeutic culture, which seems designed to validate every opinion and feeling, there will rarely be disagreement without anger between thin-skinned people who cannot distinguish the phrase “you’re wrong” from “you’re stupid.” Equating “critical thinking” with “relentless criticism” results in something worse than the indiscriminate rejection of this or that actual expert. This equation produces what Nichols calls “a Google-fueled, Wikipedia-based, blog-sodden” disdain for even the ideal of expertise. This ideal is an affront in a culture that “cannot endure even the slightest hint of inequality of any kind.” Unfortunately, Nichols notes, “specialization is necessarily exclusive.” And aren’t we glad: “When you take an elevator to the top floor of a tall building, the certificate in the elevator does not say ‘good luck up there’; it says that a civic authority, relying on engineers educated and examined by other engineers, has looked at that box and knows, with as much certainty as anyone can, that you’ll be safe.”48

  The “spreading epidemic of misinformation” gives rise to a corollary to Gresham’s Law (“bad money drives out good”): “Misinformation pushes aside knowledge.” Everyone with a smartphone has in his or her pocket, Nichols says, more information “than ever existed in the entire Library of Alexandria.” This can, however, produce a self-deluding veneer of erudition and a sense of cheap success. Nichols recounts an old joke about a British Foreign Office official who retired after forty years: “Every morning I went to the Prime Minister and assured him there would be no world war today. And I am pleased to note that in a career of 40 years, I was only wrong twice.”49 This official deserved an A grade, like everyone else.

  It would help if people would move their electronic devices from the center of their existence and pick up a book. Johannes Gutenberg’s invention of printing with moveable type was the necessary precursor of mass literacy, and hence was, as Senator Ben Sasse says, “arguably the most radical leveling event in history.”50 Democratizing access to information was a necessary solvent of hierarchies based on privileged access to knowledge, and a necessary precondition for social mobility. There have been times when reading was regarded with suspicion. Some among the ancient Greeks regarded the rise of reading as cultural decline: They considered oral dialogue, which involves the constant posing of clarifying questions, more conducive to truth. But the transition from an oral to a print culture has generally been a transition from a tribal society to a society of self-consciously separated individuals. In Europe, that transition alarmed ruling elites, for whom the “crisis of literacy” was the fact that there was too much literacy: Readers had, inconveniently, minds of their own. Reading is inherently private; hence the reader is potentially beyond state supervision or crowd psychology. This suggests why there are perils in the transition from a print to an electronic culture. Time was, books were the primary means of knowing things. Now many people learn most things visually, from the graphic presentation of immediately, effortlessly accessible pictures. People grow accustomed to the narcotic effect of their own passive reception of today’s sensory blitzkrieg of surfaces. They recoil from the more demanding nature of active engagement with books—with the nuances encoded in the limitless permutations of alphabetic signs on pages. Besides, reading requires two things that are increasingly scarce, and to which increasing numbers of Americans seem allergic: solitude and silence.

  “BUT IF NOT”

  Reading books that are part of a community’s shared experience is one way of assuring the reality of a community. In 1872, Theodore Roosevelt, Sr., took his family to Europe for an extended tour, during which his son Teddy did what fourteen-year-olds often do: He had a growth spurt. His parents had to buy him a new suit because his wrists and ankles protruded comically from the suit he had brought from New York. He and his family called his outgrown clothes “my ‘Smike suit,’ because it left my wrists and ankles as bare as those of poor Smike himself.”51 Smike was one of the forty urchins badly treated by Wackford Squeers at Dotheboys Hall, the Yorkshire school that Charles Dickens described so darkly in Nicholas Nickleby. For fourteen-year-old Teddy and his family, this secondary character from a Dickens novel—a novel published thirty-three years before the Roosevelts’ European tour—was familiar enough to provide cultural references. Smike was part of the shared vocabulary, the casual discourse of this family.

  In 1910, Teddy Roosevelt had for two years been a former president, and was not happy. The political mores of that day required would-be presidential candidates to manifest diffidence, if not reluctance, about seeking office. During a trip to Boston, TR stayed at the home of a supporter, and when journalists asked the supporter if TR would be a candidate in 1912, he answered simply, “Barkis is willin’.” Barkis was the stagecoach driver in David Copperfield, who relentlessly courted Clara Peggotty, Copperfield’s childhood nurse. Dickens’ readers remembered Barkis for his reiteration of the phrase “Barkis is willin’.” Eleven decades later, we are unlikely to encounter the easy, unaffected insertion of such literary references into conversations.52

  In June 1940, a British officer in the desperate circumstances of Dunkirk beach flashed to London a three-word message: “But if not.” What meaning, if any, would we today find in such an opaque—to us—message? Far from seeming opaque in 1940, it was instantly recognized, as its sender assumed it would be, as a Biblical quotation. It is from the Book of Daniel, from the passage in which Nebuchadnezzar commands Shadrach, Meshach, and Abednego to either worship the golden image or be thrust into the fiery furnace. The three threatened men respond defiantly: “Our God whom we serve is able to deliver us from the burning fiery furnace, and he will deliver us out of thine hand, O king. But if not, be it known unto thee, O king, that we will not serve thy gods, nor worship the golden image…”53 A British officer, with his back to the English Channel and his face to the Wehrmacht, expressed heroic defiance with elegant economy. He distilled his situation and his moral stance into three one-syllable words. In the cacophony of war, in the deadly confusion of an evacuation under attack, he deftly plucked from the then-common culture an almost universally familiar fragment of a passage from a book. With the fragment he connected himself, and his interlocutors, with a resonant story from the Western canon.

  Today, the very few Americans who know about the “five-foot shelf” phenomenon probably look back upon it with bemused condescension. Charles W. Eliot, who had been president of Harvard for forty years when he retired in 1909, once told a group of working men that although not everyone could go to Harvard, anyone could read like “a Harvard man.” What was required, Eliot said, was a five-foot shelf of those books that define our common culture. The P. F. Collier and Son publishing company obliged by publishing fifty-one volumes as the Harvard Classics, selling 350,000 sets in twenty years.54 Today, it would be rash to assume that Harvard itself acquaints its students with such a canon.

  The shared stock of literary and historical knowledge is not as plentiful as it used to be. When in 1840 and 1841 Dickens was publishing The Old Curiosity Shop serially in newspapers, some of his ardent readers in New York went to the docks when transatlantic ships arrived with English newspapers, anxiously shouting up to the crew members on deck, “Did little Nell die?” Times and sensibilities change, and years later Oscar Wilde said that anyone who could read of the death of Little Nell without laughing must have a heart of stone. But without regretting the passing of the hunger for Dickensian melodrama, one should very much regret the passing of the passionate reading public.

  It is simply impossible to imagine a book—any book on any subject—having the impact that Harriet Beecher Stowe’s Uncle Tom’s Cabin had when published in 1852. This novel sold 300,000 copies in the United States in the first year of its publication. Relative to population, this is comparable to selling four million copies in a year today. And the literate portion of the population was much smaller then. Within a decade Stowe’s novel sold more than two million copies in the United States. It is to this day the best-selling book of all time in proportion to population. Lincoln may or may not have said to Stowe when she visited the White House, “So you’re the little woman who started this big war.”55 But because opinion drives events in a democracy, her book was indeed a precipitant of the war. She put pen to paper and changed the world.

  Karl Marx, her contemporary, might have had trouble shoehorning her achievement into his philosophy of history. His grave in London’s Highgate Cemetery bears these words of his: “The philosophers have only interpreted the world, in various ways. The point, however, is to change it.”56 Marx misinterpreted the world in various ways, and changed it for the worse, because interpretations have consequences. Marx devalued humanity’s intellectual history by arguing that human consciousness is not merely conditioned by, but controlled by, a society’s system of production. As H. Stuart Hughes observed, Marx and Marxists became fond of “obstetrical metaphors” about the present being pregnant with the future, which must be midwived by this or that force or political party. Such biological categories encouraged thinking about social change as a non-volitional process with a progressive inner logic. Soon after Marx, Freud stressed the irrational, or non-rational, instinctive side of life. Marx focused on the limited freedom of human consciousness circumscribed by social arrangements. Freud saw freedom limited by inner “drives.” Marx, Freud, and other intellectuals were convinced that they had pierced the veil of appearances, unmasked illusions, and found a substratum that supposedly is the real basis of things, thereby solving the problem of observing and analyzing society.57

  There was, however, an anti-intellectual aspect to what such thinkers were thinking. They all said, in one way or another, that ideas were derivative from, or reflections of, other things. In one way or another, this made the idea of human agency problematic, which in turn complicated the study of history and the valuing of self-government. The motivated individual is supposedly the crux of historical study. But in what sense is the individual the master of his motives, and therefore really a history-maker? If Marx was right, classes, not individuals, are history’s motors. And when Darwinian categories infused social thinking, the idea of consciousness became entangled with the ideas of environmental and hereditary causation. One result was “scientific fatalism,” a retreat not necessarily from the idea of progress, but from the seventeenth- and eighteenth-century Enlightenment idea that progress could be produced by largely uncircumscribed choices, consciously made for clear motives on the basis of objectively verifiable social ideas.58 The Enlightenment idea was that humanity could understand and manipulate the social world because humanity made it. This idea was challenged by a new intellectual field, the sociology of knowledge. Interest in the social conditioning of thought—the thought of individuals, classes, or entire societies—was a consequence of the new insistence on the historicity of everything.

  Humanity has been plied and belabored by various historicisms purporting to prove that what has happened had to happen, that history is a dry story of the ineluctable working of vast impersonal forces unfolding according to iron laws of social evolution. In his preface to Das Kapital, Karl Marx said that the “natural laws of capitalist production”—“the economic law of motion in modern society”—are “working with iron necessity towards inevitable results.”59 This theory of history was of a piece with his theory that human nature was not a permanent essence (Wesen) but something constantly created by the unfolding of teleological History in accordance with those “iron” laws. If Marx was right, the task of understanding the past becomes difficult, perhaps insuperably so. Besides, why bother? If the ancient Greeks and Romans, or the people of the Middle Ages, had natures that were not natural, but were “reflections” of the relations of classes to their social systems, understanding them is not pertinent to our situation. If there is no human nature and therefore no constant categories—if it is not consciousness that determines existence, but existence that determines consciousness—then the understanding of previous eras is difficult, moral judgments about them are pointless, and they are not relevant to the ongoing human story.

  When such thinking spreads, so does the danger that life will be swallowed by the politics of consciousness. If people are taught that they are mere corks bobbing on a tide of irresistible causality, they will be tempted by passivity and tormented by the fact that their consciousness is not really theirs. Surely there is a connection between the various theories about human agency being attenuated or chimerical and the emergence of a cowering, timid generation of students embracing a cult of fragility and demanding to be made “safe” from almost everything. Furthermore, to deny the autonomy of culture, explaining it as an “epiphenomenon,” a “reflection” of other forces, is to drain culture of dignity. The reduction of the study of literature to sociology, and of sociology to mere ideological assertion, has a central tenet: Writers are captives of the conditioning of their class, sex, race. All literature on which canonical status is conferred merely represents the disguised or unexamined assumptions and interests of the dominant class, sex, race. Critics armed with what has been called the “hermeneutics of suspicion” radically devalue authors and elevate the ideologists—the critics themselves—as indispensable decoders of literature, all of which is, by definition, irreducibly political.

  Shakespeare’s Tempest reflects the imperialist rape of the Third World. Emily Dickinson’s poetic references to peas and flower buds are encoded messages of feminist rage, exalting clitoral masturbation to protest the prison of patriarchal sex roles. Jane Austen’s supposed serenity masks boiling fury about male domination, expressed in the nastiness of minor characters who are “really” not minor. In Wuthering Heights, Emily Brontë, a subtle subversive, has Catherine bitten by a male bulldog. The supplanting of aesthetic by political responses to literature makes literature primarily interesting as a mere index of who had power and whom the powerful victimized. The left’s agenda liberates literature from aesthetic standards, all of which are defined as merely sublimated assertions of power by society’s dominant group. It follows that all critics and authors from particular victim groups should be held only to the political standards of their group. Administration of these standards, and of the resulting racial and sexual spoils system in the academy, “requires” group politics: Under the spreading chestnut tree, I tenure you and you tenure me.

  As aesthetic judgments are politicized, political judgments are aestheticized: The striking of poses and the enjoyment of catharsis are central in the theater of victimization in academic life. All of this, although infantile, is not trivial. By “deconstructing,” or politically decoding, or otherwise attacking the meaning of literary works, critics strip literature of its authority. Criticism displaces literature and critics displace authors as bestowers of meaning. In the writing of history, too, there is a stilted style that makes the writer the center of attention. It does so by—to use a verb much favored in academia—privileging a strange vocabulary. To select at random just one from uncountable possible examples, when an American historian says that Franco’s Spain was “as savagely hierarchizing” as Hitler’s Germany, she is writing in a manner that no one outside the academy writes. When, in her ostentatiously mannered way, she says that Catholicism “problematized” the ideological rapprochement between Francoism and Nazism, or when she says that liberalism, constitutionalism, and democracy are concepts that must be “interrogated” in specific contexts, her vocabulary aggressively signals her membership in a closed clerisy with its private argot.60 If one believes that all literature, properly “contextualized,” is to be seen through the lens of politics; if one believes that because history is opaque, it should be discussed in opaque jargon; if one believes that things that are judged to be true, good, and beautiful are, anytime and everywhere, judged by standards that are mere matters of opinion and beyond rational defense; if one believes these things, they have consequences for our understanding of what education should, or can, be.

 
