The Best American Essays 2012
At the end of every year I give out what I call the Sidney Awards (named after the midcentury philosopher Sidney Hook) for the best magazine essays of the year. Every year it is my impression that the essays are better than the year before.
The same is true, I think, for the essays in this book. Yes, I had to wade through many soporific essays by people who had no truly interesting experience to relate but still wanted to write finely about it. Yes, I had to wade through a lot of essays on what seem to be the primary subjects of our era: Alzheimer’s, senior citizen homes, aging, and the death of a parent.
But I still had many jewels to choose from. I tried to pick ones that crystallize an emotion, in the belief that reading them will add to your emotional repertoire. I tried to pick ones with new or daring ideas that will alter how you look at the world. I tried, in short, to pick ones that will be useful to you. That, I’m afraid, is a middlebrow activity. But I plead guilty. I want to be improved by the things I read.
That self-improving ethos was taken for granted in the mid-twentieth century, and now we’re fortified by the knowledge that the things that are most lasting and edifying are the things that lodge in the brain most deeply, which means they are emotional, enjoyable, and fun.
DAVID BROOKS
BENJAMIN ANASTAS
The Foul Reign of “Self-Reliance”
FROM The New York Times Magazine
MY FIRST EXPOSURE to the high-flown pap of Ralph Waldo Emerson’s “Self-Reliance” came in a basement classroom at the private boys’ school where I enrolled to learn the secrets of discipline and because I wanted, at age fourteen, to wear a tie. The class was early American literature, the textbook an anthology with the heft of a volume of the Babylonian Talmud; a ribbon for holding your place between “Rip Van Winkle,” by Washington Irving, and “Young Goodman Brown,” by Nathaniel Hawthorne; and a slick hardcover the same shade of green as the back side of a dollar bill.
Our teacher, let’s call him Mr. Sideways, had a windblown air, as if he had just stepped out of an open coupe, and the impenetrable self-confidence of someone who is convinced that he is liked. (He was not.) “Whoso would be a man,” he read aloud to a room full of slouching teenage boys in button-down shirts and ties stained with sloppy Joes from the dining hall, “must be a nonconformist. He who would gather immortal palms must not be hindered by the name of goodness . . . Nothing is at last sacred but the integrity of your own mind.” And then he let loose the real hokum: “Absolve you to yourself,” he read, “and you shall have the suffrage of the world.”
I am sure that Mr. Sideways lectured dutifully on transcendentalism and its founding ideas—Emerson’s “transparent eyeball” and its gift of x-ray sight; Thoreau’s flight from a life of “quiet desperation” in society to the stillness of Walden Pond; the starred ceiling of the heavens that Ralph Waldo called the “Over-Soul,” uniting us with its magnetic beams—but what I remember most about that English class was the week that Mr. Sideways told us to leave our anthologies at home so that he could lead us in a seminar in how to make a fortune in real estate by tapping the treasure-trove he referred to as “OPM,” or Other People’s Money. He drew pyramids and pie charts on the blackboard. He gave us handouts.
For years I blamed Mr. Sideways—and the money fever of the 1980s—for this weird episode of hucksterism in English class. But that was being unfair. Our teacher had merely fallen under the spell, like countless others before and after, of the most pernicious piece of literature in the American canon. The whim that inspired him to lead a seminar in house-flipping to a stupefied underage audience was Emerson’s handiwork. “All that Adam had,” he goads in his essay “Nature,” “all that Caesar could, you have and can do.” Oh, the deception! The rank insincerity! It’s just like the Devil in Muttonchops to promise an orgiastic communion fit for the gods, only to deliver a gospel of “self-conceit so intensely intellectual,” as Melville complained, “that at first one hesitates to call it by its right name.”
The excessive love of individual liberty that debases our national politics? It found its original poet in Ralph Waldo. The plague of devices that keep us staring into the shallow puddle of our dopamine reactions, caressing our touchscreens for another fix of our own importance? That’s right: it all started with Emerson’s “Self-Reliance.” Our fetish for the authentically homespun and the American affliction of ignoring volumes of evidence in favor of the flashes that meet the eye, the hunches that seize the gut? It’s Emerson again, skulking through Harvard Yard in his cravat and greasy undertaker’s waistcoat, while in his mind he’s trailing silken robes fit for Zoroaster and levitating on the grass.
Before it does another generation’s worth of damage to the American psyche, let’s put an end to the foul reign of “Self-Reliance” and let the scholars pick over the meaning of its carcass. One question first, though: Is there anything worth salvaging among the spiritualist ramblings, obscure metaphysics, and aphorisms so pandering that Joel Osteen might think twice about delivering them? Is there an essential part of Emerson’s signature essay that we’ve somehow lost sight of?
“There is a time in every man’s education,” Emerson writes, presuming, with his usual élan, to both personify his young country and issue a decree for its revival, “when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on the plot of ground which is given him to till.”
As the story in our high school anthology went, the citizenry that the Bard of Concord met on his strolls through the town green in the 1830s were still cowed by the sermons of their Puritan forefathers—we had read Jonathan Edwards’s “Sinners in the Hands of an Angry God” to get a taste—prone to awe when it came to the literature of distant foreign empires and too complacent on the biggest moral issues of the day: the institution of slavery and the genocide of the Indians. (At least Emerson saw well enough with his transparent eye to criticize both.) The country had every bit of God-given energy and talent and latent conviction that it needed to produce genius, he believed, but too much kowtowing to society and the approval of elders had tamed his fellows of their natural gifts (the “aboriginal Self,” he called it) and sapped them of their courage.
“Most men have bound their eyes with one or another handkerchief,” a disenchanted Emerson observed, “and attached themselves to . . . communities of opinion. This conformity makes them not false in a few particulars, but false in all particulars.” Society operates like a corporation that requires its shareholders to sacrifice their rights for the comfort of all, Emerson believed. Instead of “realities and creators,” it gives men “names and customs.”
So what is his cure for the country’s ailing soul, his recipe for our deliverance from civilization and its discontents? This is the aim of “Self-Reliance,” which Emerson culled from a series of lectures he delivered at the Masonic Temple of Boston—his “Divinity School Address” at Harvard in 1838, denounced by one listener as “an incoherent rhapsody,” had already caused an outcry—and published in his collection Essays: First Series in 1841. Cornel West has praised Emerson for his “dynamic perspective” and for his “prescription for courageous self-reliance by means of nonconformity and inconsistency.” Harold Bloom noted, in an article for the New York Times, that by “‘self-reliance’ Emerson meant the recognition of the God within us, rather than the worship of the Christian godhead.” This is the essay’s greatest virtue for its original audience: it ordained them with an authority to speak what had been reserved for only the powerful, and bowed to no greater human laws, social customs, or dictates from the pulpit. “Trust thyself: every heart vibrates to that iron string.” Or: “No law can be sacred to me but that of my nature.” Some of the lines are so ingrained in us that we know them by heart. They feel like natural law.
There is a downside to ordaining the self with divine authority, though. We humans are fickle creatures, and natures—however sacred—can mislead us. That didn’t bother Emerson. “Speak what you think now in hard words,” Emerson exhorted, “and tomorrow speak what tomorrow thinks in hard words again, though it contradict every thing you said today.” (Memo to Mitt Romney: no more apologies for being “as consistent as human beings can be.” You’re Emersonian!)
The larger problem with the essay, and its more lasting legacy as a cornerstone of the American identity, has been Emerson’s tacit endorsement of a radically self-centered worldview. It’s a lot like the Ptolemaic model of the planets that preceded Copernicus; the sun, the moon, and the stars revolve around our portable reclining chairs, and whatever contradicts our right to harbor misconceptions—whether it be birtherism, climate science denial, or the conviction that Trader Joe’s sells good food—is the prattle of the unenlightened majority and can be dismissed out of hand.
“A man is to carry himself in the presence of all opposition,” Emerson advises, “as if every thing were titular and ephemeral but he.” If this isn’t the official motto of the 112th Congress of the United States, well, it should be. The gridlock, grandstanding, rule manipulating, and inability to compromise aren’t symptoms of national decline. We’re simply coming into our own as Emerson’s republic.
Just recently I was watching the original “Think Different” spot that reversed Apple Computer’s fortunes when it was first shown in 1997 and marked the first real triumph for Steve Jobs after his return from the wilderness to the company he had helped to found. The echoes of Emerson in the ad are striking, especially in the famous voice-over narration by Richard Dreyfuss, reading a poem now known by historians and Apple’s legion of fans as “Here’s to the Crazy Ones.” The message was already familiar when it first met our ears.
In calling out to all the misfits and the rebels and the troublemakers, the “round pegs in square holes” who “see things differently” and have trouble with the rules, the ad evokes the ideal first created by Emerson of a rough-hewn outsider who changes the world through a combination of courage, tenacity, resourcefulness and that God-given wildcard, genius. While Dreyfuss narrates, archival footage of the “crazy ones” flickers on the screen in black and white: Albert Einstein leads the way, followed by Bob Dylan, the Reverend Martin Luther King Jr., a jubilant Richard Branson shaking a champagne bottle in a flight suit.
This is the problem when the self is endowed with divinity, and it’s a weakness that Emerson acknowledged: if the only measure of greatness is how big an iconoclast you are, then there really is no difference between coming up with the theory of relativity, plugging in an electric guitar, leading a civil rights movement, or spending great gobs of your own money to fly a balloon across the Atlantic. In “Self-Reliance,” Emerson addresses this potentially fatal flaw in his thinking with a principle he calls “the law of consciousness.” (It is not convincing.) Every one of us has two confessionals, he writes. At the first, we clear our actions in the mirror (a recapitulation of the dictum “trust thyself”). At the second, we consider whether we’ve fulfilled our obligations to our families, neighbors, communities, and—here Emerson can’t resist a bit of snark—our cats and dogs. Which confessional is the higher one? To whom do we owe our ultimate allegiance? It’s not even a contest.
“I have my own stern claims and perfect circle,” Emerson writes. With this one fell swoop, Emerson tips the scales in favor of his own confessional, and any hope he might have raised for creating a balance to the self’s divinity is lost. Ever since, we’ve been misreading him, or at least misapplying him. As a sad result, it has been the swagger of a man’s walk that makes his measure, and Americans’ right to love ourselves before any other that trumps all.
MARCIA ANGELL
The Crazy State of Psychiatry
FROM The New York Review of Books
The Epidemic of Mental Illness: Why?
IT SEEMS THAT AMERICANS are in the midst of a raging epidemic of mental illness, at least as judged by the increase in the numbers treated for it. The tally of those who are so disabled by mental disorders that they qualify for Supplemental Security Income (SSI) or Social Security Disability Insurance (SSDI) increased nearly two and a half times between 1987 and 2007—from 1 in 184 Americans to 1 in 76. For children, the rise is even more startling—a thirty-five-fold increase in the same two decades. Mental illness is now the leading cause of disability in children, well ahead of physical disabilities like cerebral palsy or Down syndrome, for which the federal programs were created.
A large survey of randomly selected adults, sponsored by the National Institute of Mental Health (NIMH) and conducted between 2001 and 2003, found that an astonishing 46 percent met criteria established by the American Psychiatric Association (APA) for having had at least one mental illness within four broad categories at some time in their lives. The categories were “anxiety disorders,” including, among other subcategories, phobias and post-traumatic stress disorder (PTSD); “mood disorders,” including major depression and bipolar disorders; “impulse-control disorders,” including various behavioral problems and attention deficit hyperactivity disorder (ADHD); and “substance use disorders,” including alcohol and drug abuse. Most met criteria for more than one diagnosis. Of a subgroup affected within the previous year, a third were under treatment—up from a fifth in a similar survey ten years earlier.
Nowadays treatment by medical doctors nearly always means psychoactive drugs, that is, drugs that affect the mental state. In fact, most psychiatrists treat only with drugs, and refer patients to psychologists or social workers if they believe psychotherapy is also warranted. The shift from “talk therapy” to drugs as the dominant mode of treatment coincides with the emergence over the past four decades of the theory that mental illness is caused primarily by chemical imbalances in the brain that can be corrected by specific drugs. That theory became broadly accepted, by the media and the public as well as by the medical profession, after Prozac came to market in 1987 and was intensively promoted as a corrective for a deficiency of serotonin in the brain. The number of people treated for depression tripled in the following ten years, and about 10 percent of Americans over age six now take antidepressants. The increased use of drugs to treat psychosis is even more dramatic. The new generation of antipsychotics, such as Risperdal, Zyprexa, and Seroquel, has replaced cholesterol-lowering agents as the top-selling class of drugs in the U.S.
What is going on here? Is the prevalence of mental illness really that high and still climbing? Particularly if these disorders are biologically determined and not a result of environmental influences, is it plausible to suppose that such an increase is real? Or are we learning to recognize and diagnose mental disorders that were always there? On the other hand, are we simply expanding the criteria for mental illness so that nearly everyone has one? And what about the drugs that are now the mainstay of treatment? Do they work? If they do, shouldn’t we expect the prevalence of mental illness to be declining, not rising?
These are the questions, among others, that concern the authors of the three provocative books under review here. They come at the questions from different backgrounds—Irving Kirsch is a psychologist at the University of Hull in the UK, Robert Whitaker a journalist and previously the author of a history of the treatment of mental illness called Mad in America (2001), and Daniel Carlat a psychiatrist who practices in a Boston suburb and publishes a newsletter and blog about his profession.
The authors emphasize different aspects of the epidemic of mental illness. Kirsch is concerned with whether antidepressants work. Whitaker, who has written an angrier book, takes on the entire spectrum of mental illness and asks whether psychoactive drugs create worse problems than they solve. Carlat, who writes more in sorrow than in anger, looks mainly at how his profession has allied itself with, and is manipulated by, the pharmaceutical industry. But despite their differences, all three are in remarkable agreement on some important matters, and they have documented their views well.
First, they agree on the disturbing extent to which the companies that sell psychoactive drugs—through various forms of marketing, both legal and illegal, and what many people would describe as bribery—have come to determine what constitutes a mental illness and how the disorders should be diagnosed and treated. This is a subject to which I’ll return.
Second, none of the three authors subscribes to the popular theory that mental illness is caused by a chemical imbalance in the brain. As Whitaker tells the story, that theory had its genesis shortly after psychoactive drugs were introduced in the 1950s. The first was Thorazine (chlorpromazine), which was launched in 1954 as a “major tranquilizer” and quickly found widespread use in mental hospitals to calm psychotic patients, mainly those with schizophrenia. Thorazine was followed the next year by Miltown (meprobamate), sold as a “minor tranquilizer” to treat anxiety in outpatients. And in 1957, Marsilid (iproniazid) came on the market as a “psychic energizer” to treat depression.
In the space of three short years, then, drugs had become available to treat what at that time were regarded as the three major categories of mental illness—psychosis, anxiety, and depression—and the face of psychiatry was totally transformed. These drugs, however, had not initially been developed to treat mental illness. They had been derived from drugs meant to treat infections, and were found only serendipitously to alter the mental state. At first, no one had any idea how they worked. They simply blunted disturbing mental symptoms. But over the next decade, researchers found that these drugs, and the newer psychoactive drugs that quickly followed, affected the levels of certain chemicals in the brain.