
Hooking Up


by Tom Wolfe


  Sorry, Fran, but it’s third and twenty-three and the genetic fix is in, and the new message is now being pumped out into the popular press and onto television at a stupefying rate. Who are the pumps? They are a new breed who call themselves “evolutionary psychologists.” You can be sure that twenty years ago the same people would have been calling themselves Freudian; but today they are genetic determinists, and the press has a voracious appetite for whatever they come up with.

  The most popular study currently—it is still being featured on television news shows—is David Lykken and Auke Tellegen’s study at the University of Minnesota of two thousand twins that shows, according to these two evolutionary psychologists, that an individual’s happiness is largely genetic. Some people are hardwired to be happy and some are not. Success (or failure) in matters of love, money, reputation, or power is transient stuff; you soon settle back down (or up) to the level of happiness you were born with genetically. Fortune devoted a long takeout, elaborately illustrated, to a study by evolutionary psychologists at Britain’s University of Saint Andrews showing that you judge the facial beauty or handsomeness of people you meet not by any social standards of the age you live in but by criteria hardwired in your brain from the moment you were born. Or, to put it another way, beauty is not in the eye of the beholder but embedded in his genes. In fact, today, in the year 2000, if your appetite for newspapers, magazines, and television is big enough, you will quickly get the impression that there is nothing in your life, including the fat content of your body, that is not genetically predetermined. If I may mention just a few things the evolutionary psychologists have illuminated for me recently:

  One widely publicized study found that women are attracted to rich or powerful men because they are genetically hardwired to sense that alpha males will be able to take better care of their offspring. So if her current husband catches her with somebody better than he is, she can say in all sincerity, “I’m just a lifeguard in the gene pool, honey.” Personally, I find that reassuring. I used to be a cynic. I thought the reason so many beautiful women married ugly rich men was that they were schemers, connivers, golddiggers. Another study found that the male of the human species is genetically hardwired to be polygamous, i.e., unfaithful to his legal mate, so that he will cast his seed as widely as humanly possible. Well … men can read, too! “Don’t blame me, honey. Four hundred thousand years of evolution made me do it.” Another study showed that most murders are the result of genetically hardwired compulsions. Well … convicts can read, too, and hoping for parole, they report to the prison psychiatrist: “Something came over me … and then the knife went in.” Another showed that teenage girls, being in the prime of their fecundity, are genetically hardwired to be promiscuous and are as helpless to stop themselves as minks or rabbits. Some public school systems haven’t had to be told twice. They provide not only condoms but also special elementary, junior high, and high schools where teenage mothers can park their offspring in nursery rooms while they learn to read print and do sums.

  Where does that leave “self-control”? In quotation marks, like many old-fashioned notions—once people believe that this ghost in the machine, “the self,” does not even exist and brain imaging proves it, once and for all.

  So far, neuroscientific theory is based largely on indirect evidence, from studies of animals or of how a normal brain changes when it is invaded (by accidents, disease, radical surgery, or experimental needles). Darwin II himself, Edward O. Wilson, has only limited direct knowledge of the human brain. He is a zoologist, not a neurologist, and his theories are extrapolations from the exhaustive work he has done in his specialty, the study of insects. The French surgeon Paul Broca discovered Broca’s area, one of the two speech centers of the left hemisphere of the brain, only after one of his patients suffered a stroke. Even the PET scan and the PET reporter gene/PET reporter probe are technically medical invasions, since they require the injection of chemicals or viruses into the body. But they offer glimpses of what the noninvasive imaging of the future will probably look like. A neuroradiologist can read a list of topics out loud to a person being given a PET scan, topics pertaining to sports, music, business, history, whatever, and when he finally hits one the person is interested in, a particular area of the cerebral cortex actually lights up on the screen. Eventually, as brain imaging is refined, the picture may become as clear and complete as those see-through exhibitions, at auto shows, of the inner workings of the internal combustion engine. At that point it may become obvious to everyone that all we are looking at is a piece of machinery, an analog chemical computer, that processes information from the environment. “All,” since you can look and look and you will not find any ghostly self inside, or any mind, or any soul.

  Thereupon, in the year 2010 or 2030, some new Nietzsche will step forward to announce: “The self is dead”—except that being prone to the poetic, like Nietzsche the First, he will probably say: “The soul is dead.” He will say that he is merely bringing the news, the news of the greatest event of the millennium: “The soul, that last refuge of values, is dead, because educated people no longer believe it exists.” Unless the assurances of the Wilsons and the Dennetts and the Dawkinses also start rippling out, the madhouse that will ensue may make the phrase “the total eclipse of all values” seem tame.

  If I were a college student today, I don’t think I could resist going into neuroscience. Here we have the two most fascinating riddles of the twenty-first century: the riddle of the human mind and the riddle of what happens to the human mind when it comes to know itself absolutely. In any case, we live in an age in which it is impossible and pointless to avert your eyes from the truth.

  Ironically, said Nietzsche, this unflinching eye for truth, this zest for skepticism, is the legacy of Christianity (for complicated reasons that needn’t detain us here). Then he added one final and perhaps ultimate piece of irony in a fragmentary passage in a notebook shortly before he lost his mind (to the late nineteenth century’s great venereal scourge, syphilis). He predicted that eventually modern science would turn its juggernaut of skepticism upon itself, question the validity of its own foundations, tear them apart, and self-destruct. I thought about that in the summer of 1994, when a group of mathematicians and computer scientists held a conference at the Santa Fe Institute on “Limits to Scientific Knowledge.” The consensus was that since the human mind is, after all, an entirely physical apparatus, a form of computer, the product of a particular genetic history, it is finite in its capabilities. Being finite, hardwired, it will probably never have the power to comprehend human existence in any complete way. It would be as if a group of dogs were to call a conference to try to understand The Dog. They could try as hard as they wanted, but they wouldn’t get very far. Dogs can communicate only about forty notions, all of them primitive, and they can’t record anything. The project would be doomed from the start. The human brain is far superior to the dog’s, but it is limited nonetheless. So any hope of human beings arriving at some final, complete, self-enclosed theory of human existence is doomed, too.

  This, science’s Ultimate Skepticism, has been spreading ever since then. Over the past two years even Darwinism, a sacred tenet among American scientists for the past seventy years, has been beset by … doubts. Scientists—not religiosi—notably the mathematician David Berlinski (“The Deniable Darwin,” Commentary, June 1996) and the biochemist Michael Behe (Darwin’s Black Box, 1996) have begun attacking Darwinism as a mere theory, not a scientific discovery, a theory woefully unsupported by fossil evidence and featuring, at the core of its logic, sheer mush. (Dennett and Dawkins, for whom Darwin is the Only Begotten, the Messiah, are already screaming. They’re beside themselves, utterly apoplectic. Wilson, the giant, keeping his cool, has remained above the battle.) Noam Chomsky has made things worse by pointing out that there is nothing even in the highest apes remotely comparable to human speech, which is in turn the basis of recorded memory and, therefore, everything from skyscrapers and missions to the moon to Islam and little matters such as the theory of evolution. He says it’s not that there is a missing link; there is nothing to link up with. By 1990 the physicist Petr Beckmann of the University of Colorado had already begun going after Einstein. He greatly admired Einstein for his famous equation of matter and energy, E=mc², but called his theory of relativity mostly absurd and grotesquely untestable. Beckmann died in 1993. His Fool Killer’s cudgel has been taken up by Howard Hayden of the University of Connecticut, who has many admirers among the upcoming generation of Ultimately Skeptical young physicists. The scorn the new breed heaps upon quantum mechanics (“has no real-world applications” … “depends entirely on goofball equations”), Unified Field Theory (“Nobel worm bait”), and the Big Bang Theory (“creationism for nerds”) has become withering. If only Nietzsche were alive! He would have relished every minute of it!

  Recently I happened to be talking to a prominent California geologist, and she told me: “When I first went into geology, we all thought that in science you create a solid layer of findings, through experiment and careful investigation, and then you add a second layer, like a second layer of bricks, all very carefully, and so on. Occasionally some adventurous scientist stacks the bricks up in towers, and these towers turn out to be insubstantial and they get torn down, and you proceed again with the careful layers. But we now realize that the very first layers aren’t even resting on solid ground. They are balanced on bubbles, on concepts that are full of air, and those bubbles are being burst today, one after the other.”

  I suddenly had a picture of the entire astonishing edifice collapsing and modern man plunging headlong back into the primordial ooze. He’s floundering, sloshing about, gulping for air, frantically treading ooze, when he feels something huge and smooth swim beneath him and boost him up, like some almighty dolphin. He can’t see it, but he’s much impressed. He names it God.

  VITA ROBUSTA, ARS ANOREXICA

  In the Land of the Rococo Marxists

  Where was I? On the wrong page? The wrong channel? Outside the bandwidth? As building managers here in New York shut down the elevators at 11:30 p.m. on December 31, 1999, so that citizens would not be trapped between floors by Y2K microchip failures—and licensed pyrotechnicians launched EPA-sanctioned fireworks from cordoned-off Central Park “venues” at precisely 12:00:01 a.m., January 1, 2000, to mark the arrival of the twenty-first century and the third millennium—did a single solitary savant note that the First American Century had just come to an end and the Second American Century had begun?—and that there might well be five, six, eight more to come?—resulting in a Pax Americana lasting a thousand years? Or did I miss something?

  Did a single historian mention that America now dominates the world to an extent that would have made Julius Caesar twitch with envy?—would have made Alexander the Great, who thought there were no more worlds to conquer, get down on all fours and beat his fists on the ground in despair because he was merely a warrior and had never heard of international mergers and acquisitions, rock and rap, fireball movies, TV, the NBA, the World Wide Web, and the “globalization” game?

  Was a single bard bestirred to write a mighty anthem—along the lines of James Thomson’s “Rule, Britannia! Britannia rule the waves! Britons never, never, never shall be slaves!”—for America, the nation that in the century just concluded had vanquished two barbaric nationalistic brotherhoods, the German Nazis and the Russian Communists, two hordes of methodical slave-hunting predators who made the Huns and Magyars look whimsical by comparison? Or had the double A’s in my Discman died on me?

  Did anybody high or low look for a Frédéric-Auguste Bartholdi to create a new tribute on the order of the Statue of Liberty for the nation that in the twentieth century, even more so than in the nineteenth, opened her arms to people from all over the globe—to Vietnamese, Thais, Cambodians, Laotians, Hmong, Ethiopians, Albanians, Senegalese, Guyanese, Eritreans, Cubans, as well as everybody else—and made sure they enjoyed full civil rights, including the means to take political power in a city the size of Miami if they could muster the votes? Did anybody even wistfully envision such a monument to America the International Haven of Democracy? Or had my Flash Art subscription run out?

  Did any of the America-at-century’s-end network TV specials strike the exuberant note that Queen Victoria’s Diamond Jubilee struck in 1897? All I remember are voice-overs saying that for better or worse … hmm, hmm … McCarthyism, racism, Vietnam, right-wing militias, Oklahoma City, Heaven’s Gate, Doctor Death … on balance, hmm, we’re not entirely sure … for better or worse, America had won the Cold War … hmm, hmm, hmm …

  My impression was that one American Century rolled into another with all the pomp and circumstance of a mouse pad. America’s great triumph inspired all the patriotism and pride (or, if you’d rather, chauvinism), all the yearning for glory and empire (or, if you’d rather, the spirit of Manifest Destiny), all the martial jubilee music of a mouse click.

  Such was my impression; but it was only that, my impression. So I drew upon the University of Michigan’s fabled public-opinion survey resources. They sent me the results of four studies, each approaching the matter from a different angle. Chauvinism? The spirit of Manifest Destiny? According to one survey, 74 percent of Americans don’t want the United States to intervene abroad unless in cooperation with other nations, presumably so that we won’t get all the blame. Excitement? Americans have no strong feelings about their country’s supremacy one way or the other. They are lacking in affect, as the clinical psychologists say.

  There were seers who saw this coming even at the unabashedly pompous peak (June 22) of England’s 1897 Jubilee. One of them was Rudyard Kipling, the empire’s de facto poet laureate, who wrote a poem for the Jubilee, “Recessional,” warning: “Lo, all our pomp of yesterday / Is one with Nineveh and Tyre!” He and many others had the uneasy feeling that the foundations of European civilization were already shifting beneath their feet, a feeling indicated by the much-used adjectival compound fin-de-siècle. Literally, of course, it meant nothing more than “end-of-the-century,” but it connoted something modern, baffling, and troubling in Europe. Both Nietzsche and Marx did their greatest work seeking to explain the mystery. Both used the term “decadence.”

  But if there was decadence, what was decaying? Religious faith and moral codes that had been in place since time was, said Nietzsche, who in 1882 made the most famous statement in modern philosophy—“God is dead”—and three startlingly accurate predictions for the twentieth century. He even estimated when they would begin to come true: about 1915. (1) The faith men formerly invested in God they would now invest in barbaric “brotherhoods with the aim of the robbery and exploitation of the non-brothers.” Their names turned out, in due course, to be the German Nazis and the Russian Communists. (2) There would be “wars such as have never been waged on earth.” Their names turned out to be World War I and World War II. (3) There no longer would be Truth but, rather, “truth” in quotation marks, depending upon which concoction of eternal verities the modern barbarian found most useful at any given moment. The result would be universal skepticism, cynicism, irony, and contempt. World War I began in 1914 and ended in 1918. On cue, as if Nietzsche were still alive to direct the drama, an entirely new figure, with an entirely new name, arose in Europe: that embodiment of skepticism, cynicism, irony, and contempt, the Intellectual.

  The word “intellectual,” used as a noun referring to the “intellectual laborer” who assumes a political stance, did not exist until Georges Clemenceau used it in 1898 during the Dreyfus case, congratulating those “intellectuals,” such as Marcel Proust and Anatole France, who had joined Dreyfus’s great champion, Emile Zola. Zola was an entirely new form of political eminence, a popular novelist. His famous J’accuse was published on the front page of a daily newspaper, L’Aurore (“The Dawn”), which printed 300,000 copies and hired hundreds of extra newsboys, who sold virtually every last one by midafternoon.

  Zola and Clemenceau provided a wholly unexpected leg up in life for the ordinary worker ants of “pure intellectual labor” (Clemenceau’s term): your fiction writers, playwrights, poets, history and lit profs, that whole cottage industry of poor souls who scribble, scribble, scribble. Zola was an extraordinary reporter (or “documenter,” as he called himself) who had devoured the details of the Dreyfus case to the point where he knew as much about it as any judge, prosecutor, or law clerk. But that inconvenient detail of Zola’s biography was soon forgotten. The new hero, the intellectual, didn’t need to burden himself with the irksome toil of reporting or research. For that matter, he needed no particular education, no scholarly training, no philosophical grounding, no conceptual frameworks, no knowledge of academic or scientific developments other than the sort of stuff you might pick up in Section 9 of the Sunday newspaper. Indignation about the powers that be and the bourgeois fools who did their bidding—that was all you needed. Bango! You were an intellectual.

  From the very outset the eminence of this new creature, the intellectual, who was to play such a tremendous role in the history of the twentieth century, was inseparable from his necessary indignation. It was his indignation that elevated him to a plateau of moral superiority. Once up there, he was in a position to look down at the rest of humanity. And it hadn’t cost him any effort, intellectual or otherwise. As Marshall McLuhan would put it years later: “Moral indignation is a technique used to endow the idiot with dignity.” Precisely which intellectuals of the twentieth century were or were not idiots is a debatable point, but it is hard to argue with the definition I once heard a French diplomat offer at a dinner party: “An intellectual is a person knowledgeable in one field who speaks out only in others.”

 
