Bright-Sided


by Barbara Ehrenreich


  In addition, positive thinking has made itself useful as an apology for the crueler aspects of the market economy. If optimism is the key to material success, and if you can achieve an optimistic outlook through the discipline of positive thinking, then there is no excuse for failure. The flip side of positivity is thus a harsh insistence on personal responsibility: if your business fails or your job is eliminated, it must be because you didn’t try hard enough, didn’t believe firmly enough in the inevitability of your success. As the economy has brought more layoffs and financial turbulence to the middle class, the promoters of positive thinking have increasingly emphasized this negative judgment: to be disappointed, resentful, or downcast is to be a “victim” and a “whiner.”

  But positive thinking is not only a water carrier for the business world, excusing its excesses and masking its follies. The promotion of positive thinking has become a minor industry in its own right, producing an endless flow of books, DVDs, and other products; providing employment for tens of thousands of “life coaches,” “executive coaches,” and motivational speakers, as well as for the growing cadre of professional psychologists who seek to train them. No doubt the growing financial insecurity of the middle class contributes to the demand for these products and services, but I hesitate to attribute the commercial success of positive thinking to any particular economic trend or twist of the business cycle. America has historically offered space for all sorts of sects, cults, faith healers, and purveyors of snake oil, and those that are profitable, like positive thinking, tend to flourish.

  At the turn of the twenty-first century, American optimism seemed to reach a manic crescendo. In his final State of the Union address in 2000, Bill Clinton struck a triumphal note, proclaiming that “never before has our nation enjoyed, at once, so much prosperity and social progress with so little internal crisis and so few external threats.” But compared with his successor, Clinton seemed almost morose. George W. Bush had been a cheerleader in prep school, and cheerleading—a distinctly American innovation—could be considered the athletically inclined ancestor of so much of the coaching and “motivating” that has gone into the propagation of positive thinking. He took the presidency as an opportunity to continue in that line of work, defining his job as that of inspiring confidence, dispelling doubts, and pumping up the national spirit of self-congratulation. If he repeatedly laid claim to a single adjective, it was “optimistic.” On the occasion of his sixtieth birthday, he told reporters he was “optimistic” about a variety of foreign policy challenges, offering as an overview, “I’m optimistic that all problems will be solved.” Nor did he brook any doubts or hesitations among his close advisers. According to Bob Woodward, Condoleezza Rice failed to express some of her worries because, she said, “the president almost demanded optimism. He didn’t like pessimism, hand-wringing or doubt.” 6

  Then things began to go wrong, which is not in itself unusual but was a possibility excluded by America’s official belief that things are good and getting better. There was the dot-com bust that began a few months after Clinton’s declaration of unprecedented prosperity in his final State of the Union address, then the terrorist attack of September 11, 2001. Furthermore, things began to go wrong in a way that suggested that positive thinking might not guarantee success after all, that it might in fact dim our ability to fend off real threats. In her remarkable book, Never Saw It Coming: Cultural Challenges to Envisioning the Worst, sociologist Karen Cerulo recounts a number of ways that the habit of positive thinking, or what she calls optimistic bias, undermined preparedness and invited disaster. She quotes Newsweek reporters Michael Hirsh and Michael Isikoff, for example, in their conclusion that “a whole summer of missed clues, taken together, seemed to presage the terrible September of 2001.” 7 There had already been a terrorist attack on the World Trade Center in 1993; there were ample warnings, in the summer of 2001, about a possible attack by airplane, and flight schools reported suspicious students like the one who wanted to learn how to “fly a plane but didn’t care about landing and takeoff.” The fact that no one—the FBI, the INS, Bush, or Rice—heeded these disturbing cues was later attributed to a “failure of imagination.” But actually there was plenty of imagination at work—imagining an invulnerable nation and an ever-booming economy—there was simply no ability or inclination to imagine the worst.

  A similar reckless optimism pervaded the American invasion of Iraq. Warnings about possible Iraqi resistance were swept aside by leaders who promised a “cakewalk” and envisioned cheering locals greeting our troops with flowers. Likewise, Hurricane Katrina was not exactly an unanticipated disaster. In 2002, the New Orleans Times-Picayune ran a Pulitzer Prize–winning series warning that the city’s levees could not protect it against the storm surge brought on by a category 4 or 5 hurricane. In 2001, Scientific American had issued a similar warning about the city’s vulnerability. 8 Even when the hurricane struck and levees broke, no alarm bells went off in Washington, and when a New Orleans FEMA official sent a panicky e-mail to FEMA director Michael Brown, alerting him to the rising number of deaths and a shortage of food in the drowning city, he was told that Brown would need an hour to eat his dinner in a Baton Rouge restaurant. 9 Criminal negligence or another “failure of imagination”? The truth is that Americans had been working hard for decades to school themselves in the techniques of positive thinking, and these included the reflexive capacity for dismissing disturbing news.

  The biggest “come-uppance,” to use Krugman’s term, has so far been the financial meltdown of 2007 and the ensuing economic crisis. By late in the first decade of the twenty-first century, as we shall see in the chapters that follow, positive thinking had become ubiquitous and virtually unchallenged in American culture. It was promoted on some of the most widely watched talk shows, like Larry King Live and the Oprah Winfrey Show; it was the stuff of runaway best sellers like the 2006 book The Secret; it had been adopted as the theology of America’s most successful evangelical preachers; it found a place in medicine as a potential adjuvant to the treatment of almost any disease. It had even penetrated the academy in the form of the new discipline of “positive psychology,” offering courses teaching students to pump up their optimism and nurture their positive feelings. And its reach was growing global, first in the Anglophone countries and soon in the rising economies of China, South Korea, and India.

  But nowhere did it find a warmer welcome than in American business, which is, of course, also global business. To the extent that positive thinking had become a business itself, business was its principal client, eagerly consuming the good news that all things are possible through an effort of mind. This was a useful message for employees, who by the turn of the twenty-first century were being required to work longer hours for fewer benefits and diminishing job security. But it was also a liberating ideology for top-level executives. What was the point in agonizing over balance sheets and tedious analyses of risks—and why bother worrying about dizzying levels of debt and exposure to potential defaults—when all good things come to those who are optimistic enough to expect them?

  I do not write this in a spirit of sourness or personal disappointment of any kind, nor do I have any romantic attachment to suffering as a source of insight or virtue. On the contrary, I would like to see more smiles, more laughter, more hugs, more happiness and, better yet, joy. In my own vision of utopia, there is not only more comfort and security for everyone—better jobs, health care, and so forth—there are also more parties, festivities, and opportunities for dancing in the streets. Once our basic material needs are met—in my utopia, anyway—life becomes a perpetual celebration in which everyone has a talent to contribute. But we cannot levitate ourselves into that blessed condition by wishing it. We need to brace ourselves for a struggle against terrifying obstacles, both of our own making and imposed by the natural world. And the first step is to recover from the mass delusion that is positive thinking.

  ONE

  Smile or Die: The Bright Side of Cancer

  The first attempt to recruit me into positive thinking occurred at what has been, so far, the low point of my life. If you had asked me, just before the diagnosis of cancer, whether I was an optimist or a pessimist, I would have been hard-pressed to answer. But on health-related matters, as it turned out, I was optimistic to the point of delusion. Nothing had so far come along that could not be controlled by diet, stretching, Advil, or, at worst, a prescription. So I was not at all alarmed when a mammogram—undertaken as part of the routine cancer surveillance all good citizens of HMOs or health plans are expected to submit to once they reach the age of fifty—aroused some “concern” on the part of the gynecologist. How could I have breast cancer? I had no known risk factors, there was no breast cancer in the family, I’d had my babies relatively young and nursed them both. I ate right, drank sparingly, worked out, and, besides, my breasts were so small that I figured a lump or two would probably improve my figure. When the gynecologist suggested a follow-up mammogram four months later, I agreed only to placate her.

  I thought of it as one of those drive-by mammograms, one stop in a series of mundane missions including post office, supermarket, and gym, but I began to lose my nerve in the changing room, and not only because of the kinky necessity of baring my breasts and affixing tiny X-ray opaque stars to the tip of each nipple. The changing room, really just a closet off the stark, windowless space that housed the mammogram machine, contained something far worse, I noticed for the first time—an assumption about who I am, where I am going, and what I will need when I get there. Almost all of the eye-level space had been filled with photocopied bits of cuteness and sentimentality: pink ribbons, a cartoon about a woman with iatrogenically flattened breasts, an “Ode to a Mammogram,” a list of the “Top Ten Things Only Women Understand” (“Fat Clothes” and “Eyelash Curlers,” among them), and, inescapably, right next to the door, the poem “I Said a Prayer for You Today,” illustrated with pink roses.

  It went on and on, this mother of all mammograms, cutting into gym time, dinnertime, and lifetime generally. Sometimes the machine didn’t work, and I got squished into position to no purpose at all. More often, the X-ray was successful but apparently alarming to the invisible radiologist, off in some remote office, who called the shots and never had the courtesy to show her face with an apology or an explanation. I tried pleading with the technician to speed up the process, but she just got this tight little professional smile on her face, either out of guilt for the torture she was inflicting or because she already knew something that I was going to be sorry to find out for myself. For an hour and a half the procedure was repeated: the squishing, the snapshot, the technician bustling off to consult the radiologist and returning with a demand for new angles and more definitive images. In the intervals while she was off with the doctor I read the New York Times right down to the personally irrelevant sections like theater and real estate, eschewing the stack of women’s magazines provided for me, much as I ordinarily enjoy a quick read about sweatproof eyeliners and “fabulous sex tonight,” because I had picked up this warning vibe in the changing room, which, in my increasingly anxious state, translated into: femininity is death. Finally there was nothing left to read but one of the free local weekly newspapers, where I found, buried deep in the classifieds, something even more unsettling than the growing prospect of major disease—a classified ad for a “breast cancer teddy bear” with a pink ribbon stitched to its chest.

  Yes, atheists pray in their foxholes—in this case, with a yearning new to me and sharp as lust, for a clean and honorable death by shark bite, lightning strike, sniper fire, car crash. Let me be hacked to death by a madman, was my silent supplication—anything but suffocation by the pink sticky sentiment embodied in that bear and oozing from the walls of the changing room. I didn’t mind dying, but the idea that I should do so while clutching a teddy and with a sweet little smile on my face—well, no amount of philosophy had prepared me for that.

  The result of the mammogram, conveyed to me by phone a day later, was that I would need a biopsy, and, for some reason, a messy, surgical one with total anesthesia. Still, I was not overly perturbed and faced the biopsy like a falsely accused witch confronting a trial by dunking: at least I would clear my name. I called my children to inform them of the upcoming surgery and assured them that the great majority of lumps detected by mammogram—80 percent, the radiology technician had told me—are benign. If anything was sick, it was that creaky old mammogram machine.

  My official induction into breast cancer came about ten days later with the biopsy, from which I awoke to find the surgeon standing perpendicular to me, at the far end of the gurney, down near my feet, stating gravely, “Unfortunately, there is a cancer.” It took me all the rest of that drug-addled day to decide that the most heinous thing about that sentence was not the presence of cancer but the absence of me—for I, Barbara, did not enter into it even as a location, a geographical reference point. Where I once was—not a commanding presence perhaps but nonetheless a standard assemblage of flesh and words and gesture—“there is a cancer.” I had been replaced by it, was the surgeon’s implication. This was what I was now, medically speaking.

  In my last act of dignified self-assertion, I requested to see the pathology slides myself. This was not difficult to arrange in our small-town hospital, where the pathologist turned out to be a friend of a friend, and my rusty Ph.D. in cell biology (Rockefeller University, 1968) probably helped. He was a jolly fellow, the pathologist, who called me “hon” and sat me down at one end of the dual-head microscope while he manned the other and moved a pointer through the field. These are the cancer cells, he said, showing up blue because of their overactive DNA. Most of them were arranged in staid semicircular arrays, like suburban houses squeezed into cul-de-sacs, but I also saw what I knew enough to know I did not want to see: the characteristic “Indian files” of cells on the march. The “enemy,” I was supposed to think—an image to save up for future exercises in “visualization” of their violent deaths at the hands of the body’s killer cells, the lymphocytes and macrophages.

  But I was impressed, against all rational self-interest, by the energy of these cellular conga lines, their determination to move on out from the backwater of the breast to colonize lymph nodes, bone marrow, lungs, and brain. These are, after all, the fanatics of Barbara-ness, the rebel cells that have realized that the genome they carry, the genetic essence of me in whatever deranged form, has no further chance of normal reproduction in the postmenopausal body we share, so why not just start multiplying like bunnies and hope for a chance to break out?

  After the visit to the pathologist, my biological curiosity dropped to a lifetime nadir. I know women who followed up their diagnoses with weeks or months of self-study, mastering their options, interviewing doctor after doctor, assessing the damage to be expected from the available treatments. But I could tell from a few hours of investigation that the career of a breast cancer patient had been pretty well mapped out in advance: you may get to negotiate the choice between lumpectomy and mastectomy, but lumpectomy is commonly followed by weeks of radiation, and in either case, if the lymph nodes turn out, upon dissection, to be invaded—or “involved,” as it’s less threateningly put—you’re doomed to months of chemotherapy, an intervention that is on a par with using a sledgehammer to swat mosquitoes. Chemotherapy agents damage and kill not just cancer cells but any normal body cells that happen to be dividing, such as those in the skin, hair follicles, stomach lining, and bone marrow (which is the source of all blood cells, including immune cells). The results are baldness, nausea, mouth sores, immunosuppression, and, in many cases, anemia.

  These interventions do not constitute a “cure” or anything close, which is why the death rate from breast cancer had changed very little between the 1930s, when mastectomy was the only treatment available, and 2000, when I received my diagnosis. Chemotherapy, which became a routine part of breast cancer treatment in the eighties, does not confer anywhere near as decisive an advantage as patients are often led to believe. It’s most helpful for younger, premenopausal women, who can gain a 7 to 11 percentage point increase in ten-year survival rates, but most breast cancer victims are older, postmenopausal women like myself, for whom chemotherapy adds only a 2 or 3 percentage point difference, according to America’s best-known breast cancer surgeon, Susan Love. 1 So yes, it might add a few months to your life, but it also condemns you to many months of low-level sickness.

  In fact, there’s been a history of struggle over breast cancer treatments. In the seventies, doctors were still performing radical mastectomies that left patients permanently disabled on the affected side—until women’s health activists protested, insisting on less radical, “modified” mastectomies. It had also been the practice to go directly from biopsy to mastectomy while the patient was anesthetized and unable to make any decisions—again, until enough women protested. Then, in the nineties, there was a brief fad of treating patients whose cancers had metastasized by destroying all their bone marrow with high-dose chemotherapy and replacing it with bone marrow transplants—an intervention that largely served to hasten the patient’s death. Chemotherapy, radiation, and so on may represent state-of-the-art care today, but so, at one point in medical history, did the application of leeches.

 
