So I show up at the middle-school track where the relay’s going on just in time for the Survivors’ March: About 100 people, including a few men, since the funds raised will go to cancer research in general, are marching around the track eight to twelve abreast while a loudspeaker announces their names and survival times and a thin line of observers, mostly people staffing the raffle and food booths, applauds. It could be almost any kind of festivity, except for the distinctive stacks of cellophane-wrapped pink Hope Bears for sale in some of the booths. I cannot help but like the kitschy small-town Gemütlichkeit of the event, especially when the audio system strikes up that universal anthem of solidarity, “We Are Family,” and a few people of various ages start twisting to the music on the jerry-rigged stage. But the money raised is going far away, to the American Cancer Society, which will not be asking us for our advice on how to spend it.
I approach a woman I know from other settings, one of our local intellectuals, as it happens, decked out here in a pink-and-yellow survivor T-shirt and with an American Cancer Society “survivor medal” suspended on a purple ribbon around her neck. “When do you date your survivorship from?” I ask her, since the announced time, five and a half years, seems longer than I recall. “From diagnosis or the completion of your treatments?” The question seems to annoy or confuse her, so I do not press on to what I really want to ask: At what point, in a downwardly sloping breast-cancer career, does one put aside one’s survivor regalia and admit to being, in fact, a die-er? For the dead are with us even here, though in much diminished form. A series of paper bags, each about the right size for a junior burger and fries, lines the track. On them are the names of the dead, and inside each is a candle that will be lit later, after dark, when the actual relay race begins.
My friend introduces me to a knot of other women in survivor gear, breast-cancer victims all, I learn, though of course I would not use the V-word here. “Does anyone else have trouble with the term ‘survivor’?” I ask, and, surprisingly, two or three speak up. It could be “unlucky,” one tells me; it “tempts fate,” says another, shuddering slightly. After all, the cancer can recur at any time, either in the breast or in some more strategic site. No one brings up my own objection to the term, though: that the mindless triumphalism of “survivorhood” denigrates the dead and the dying. Did we who live “fight” harder than those who’ve died? Can we claim to be “braver,” better, people than the dead? And why is there no room in this cult for some gracious acceptance of death, when the time comes, which it surely will, through cancer or some other misfortune?
No, this is not my sisterhood. For me at least, breast cancer will never be a source of identity or pride. As my dying correspondent Gerri wrote: “IT IS NOT OK!” What it is, along with cancer generally or any slow and painful way of dying, is an abomination, and, to the extent that it’s human-made, also a crime. This is the one great truth that I bring out of the breast-cancer experience, which did not, I can now report, make me prettier or stronger, more feminine or spiritual—only more deeply angry. What sustained me through the “treatments” is a purifying rage, a resolve, framed in the sleepless nights of chemotherapy, to see the last polluter, along with, say, the last smug health insurance operative, strangled with the last pink ribbon. Cancer or no cancer, I will not live that long, of course. But I know this much right now for sure: I will not go into that last good night with a teddy bear tucked under my arm.
1. In the United States, one in eight women will be diagnosed with breast cancer at some point. The chances of her surviving for five years are 86.8 percent. For a black woman this falls to 72 percent; and for a woman of any race whose cancer has spread to the lymph nodes, to 77.7 percent.
2. Some improved prognostic tools, involving measuring a tumor’s growth rate and the extent to which it is supplied with blood vessels, are being developed but are not yet in use.
The Naked Truth about Fitness
Lear’s, 1990
The conversation has all the earmarks of a serious moral debate. The man is holding out for the pleasures of this life, few as they are and short as it is. The woman (we assume his wife, since they are having breakfast together and this is a prime-time television commercial) defends the high road of virtue and self-denial. We know there will be a solution, that it will taste like fresh-baked cookies and will simultaneously lower cholesterol, fight osteoporosis, and melt off unwholesome flab. We know this. What we have almost forgotten to wonder about is this: Since when is breakfast cereal a moral issue?
Morality is no longer a prominent feature of civil society. In the 1980s, politicians abandoned it, Wall Street discarded it, televangelists defiled it. Figuratively speaking, we went for the sucrose rush and forgot the challenge of fiber. But only figuratively. For as virtue drained out of our public lives, it reappeared in our cereal bowls, our exercise regimens, and our militant responses to cigarette smoke, strong drink, and greasy food.
We redefined virtue as health. And considering the probable state of our souls, this was not a bad move. By relocating the seat of virtue from the soul to the pecs, the abs, and the coronary arteries, we may not have become the most virtuous people on earth, but we surely became the most desperate for grace. We spend $5 billion a year on our health-club memberships, $2 billion on vitamins, nearly $1 billion on home-exercise equipment, and $6 billion on sneakers to wear out on our treadmills and StairMasters. We rejoice in activities that leave a hangover of muscle pain and in foods that might, in more temperate times, have been classified as fodder. To say we want to be healthy is to gravely understate the case. We want to be good.
Consider my own breakfast cereal, a tasteless, colorless substance that clings to the stomach lining with the avidity of Krazy Glue. Quite wisely, the box makes no promise of good taste or visual charm. Even the supposed health benefits are modestly outlined in tiny print. No, the incentive here is of a higher nature. “It is the right thing to do,” the manufacturer intones on the back of the box, knowing that, however alluring our temptations to evil, we all want to do the right thing.
The same confusion of the moral and the physical pervades my health club. “Commit to get fit!” is the current slogan, the verb reminding us of the moral tenacity that has become so elusive in our human relationships. In the locker room we sound like the inmates of a miraculously rehabilitative women’s prison, always repenting, forever resolving: “I shouldn’t have had that doughnut this morning.” “I wasn’t here for two weeks and now I’m going to pay the price.” Ours is a hierarchy of hardness. The soft, the slow, the easily tired rate no compassion, only the coldest of snubs.
Health is almost universally recognized as a kind of virtue. At least, most cultures strong enough to leave an ethnographic trace have discouraged forms of behavior that are believed to be unhealthy. Nevertheless, most of us recognize that health is not an accomplishment so much as it is a potential. My upper-body musculature, developed largely on Nautilus machines, means that I probably can chop wood or unload trucks, not that I ever will. Human cultures have valued many things—courage, fertility, craftsmanship, and deadly aim among them—but ours is almost alone in valuing not the deed itself but the mere capacity to perform it.
So what is it that drives us to run, lift, strain, and monitor our metabolisms as if we were really accomplishing something—something pure, that is, and noble? Sociologist Robert Crawford argues that outbreaks of American “healthism” coincide with bouts of middle-class anxiety. It was near the turn of the century, a time of economic turmoil and violent labor struggles, that white-collar Americans embarked on their first 1980s-style health craze. They hiked, rode bikes, lifted weights, and otherwise heeded Teddy Roosevelt’s call for “the strenuous life.” They filtered their water and fussed about bran (though sweets were heavily favored as a source of energy). On the loonier fringe, they tried “electric belts,” vibrating chairs, testicle supporters, “water cures,” prolonged mastication, and copious enemas—moralizing all the while about “right living” and “the divine laws of health.”
Our own health-and-fitness craze began in another period of economic anxiety—the 1970s, when the economy slid into “stagflation” and a college degree suddenly ceased to guarantee a career above the cab-driving level. In another decade—say, the 1930s or the 1960s—we might have mobilized for economic change. But the 1970s was the era of How to Be Your Own Best Friend and Looking Out for Number One, a time in which it seemed more important, or more feasible, to reform our bodies than to change the world. Bit by bit and with the best of intentions, we began to set aside the public morality of participation and protest for the personal morality of health.
Our fascination with fitness has paid off. Fewer Americans smoke; they drink less hard liquor, eat more fiber and less fat. Our rate of heart disease keeps declining, our life expectancy is on the rise. We are less dependent on doctors, more aware of our own responsibility for our health. No doubt we feel better, too, at least those of us who have the means and the motivation to give up bourbon for Evian and poker for racquetball. I personally am more confident and probably more durable as a fitness devotee than I ever was in my former life as a chairwarmer.
But there’s a difference between health and healthism, between health as a reasonable goal and health as a transcendent value. By confusing health and virtue, we’ve gotten testier, less tolerant, and ultimately less capable of confronting the sources of disease that do not lie within our individual control. Victim blaming, for example, is an almost inevitable side effect of healthism. If health is our personal responsibility, the reasoning goes, then disease must be our fault.
I think of the friend—a thoroughly intelligent, compassionate, and (need I say?) ultrafit person—who called to tell me that her sister was facing surgery for a uterine tumor. “I can’t understand it,” my friend confided. “I’m sure she’s been working out.” Not quite enough was the implication, despite the absence of even the frailest connection between fibroids and muscle tone. But like sixteenth-century Christians, we’ve come to see every illness as a punishment for past transgressions. When Chicago mayor Harold Washington died of a heart attack, some eulogizers offered baleful mutterings about his penchant for unreformed, high-cholesterol soul food. When we hear of someone getting cancer, we mentally scan their lifestyle for the fatal flaw—fatty foods, smoking, even “repressed anger.”
There are whole categories of disease that cannot, in good conscience, be blamed on the lifestyles or moral shortcomings of their victims. An estimated 25,000 cancer deaths a year, for example, result from exposure to the pesticides applied so lavishly in agribusiness. Ten thousand Americans are killed every year in industrial accidents; an estimated 20,000 more die from exposure to carcinogens in the workplace—asbestos, toxic solvents, radiation. These deaths are preventable, but not with any amount of oat bran or low-impact aerobics. Environmental and occupational diseases will require a far more rigorous social and political regimen of citizen action, legislation, and enforcement.
Even unhealthy lifestyles can have “environmental” as well as personal origins. Take the matter of diet and smoking. It’s easy for the middle-class fiber enthusiast to look down on the ghetto dweller who smokes cigarettes and spends her food stamps on Doritos and soda pop. But in low-income neighborhoods convenience stores and fast-food joints are often the only sources of food, while billboards and TV commercials are the primary sources of nutritional “information.” Motivation is another problem. It’s one thing to give up smoking and sucrose when life seems long and promising, quite another when it might well be short and brutal.
Statistically speaking, the joggers and bran eaters are concentrated in the white-collar upper-middle class. Blue- and pink-collar people still tend to prefer Bud to Evian and meat loaf to poached salmon. And they still smoke—at a rate of 51 percent, compared with 35 percent for people in professional and managerial occupations. These facts should excite our concern: Why not special cardiovascular-fitness programs for the assembly-line worker as well as the executive? Reduced-rate health-club memberships for truck drivers and typists? Nutritional supplements for the down-and-out? Instead, healthism tends to reinforce longstanding prejudices. If healthy habits are an expression of moral excellence, then the working class is not only “tacky,” ill-mannered, or whatever else we’ve been encouraged to believe—it’s morally deficient.
Thus, perversely, does healthism ease the anxieties of the affluent. No amount of straining against muscle machines can save laid-off workers; no aerobic exercises can reduce the price of a private-school education. But fitness can give its practitioners a sense of superiority over the potbellied masses. On the other side of victim blaming is an odious mood of self-congratulation: “We” may not be any smarter or more secure about our futures. But surely we are more disciplined and pure.
In the end, though—and the end does come—no one is well served by victim blaming. The victim isn’t always “someone else,” someone fatter, lazier, or more addicted to smoke and grease. The fact is that we do die, all of us, and that almost all of us will encounter disease, disability, and considerable discomfort either in the process or along the way. The final tragedy of healthism is that it leaves us so ill prepared for the inevitable. If we believe that health is a sign of moral purity and anything less is a species of sin, then death condemns us all as failures. Longevity is not a resoundingly interesting lifetime achievement, just as working out is not exactly a life’s work.
Somehow, we need to find our way back to being healthy without being healthist. Health is great. It makes us bouncier and probably happier. Better yet, it can make us fit for something: strong enough to fight the big-time polluters, for example, the corporate waste dumpers; tough enough to take on economic arrangements that condemn so many to poverty and to dangerous occupations; lean and powerful enough to demand a more nurturing, less anxiety-ridden social order.
Health is good. But it is not, as even the ancient and athletic Greeks would have said, the good.
Got Grease?
Los Angeles Times, 2002
It’s not only the stock market that has the upper classes biting their fingernails. In the last few years, the low-fat, high-carb way of life that was central to the self-esteem of the affluent has been all but discredited. If avarice was the principal vice of the bourgeoisie, a commitment to low fat was its counterbalancing virtue. You can bet, for example, that those CEOs who cooked the books and ransacked their companies’ assets did not start the day with two eggs over easy, a rasher of bacon, and a side of hash browns. No, unbuttered low-fat muffins and delicate slices of melon fueled the crimes of Wall Street: Grease was for proles.
But that dogma no longer holds up. A large number of nutritionists now deny that the low-fat approach will make you slim and resistant to heart disease. As we know, the onset of the American epidemic of obesity coincided precisely with the arrival of the antifat campaign in the 1980s, accompanied by a cornucopia of low-fat cookies, cakes, potato chips, and frozen pot-roast dinners. Millions of Americans began to pig out on “guilt-free” feasts of ungarnished carbs—with perverse and often debilitating results, especially among those unable to afford health club memberships and long hours on the elliptical trainer.
I have confirmed these findings with my own scientific study, which draws on a sample of exactly two: Jane Brody, the New York Times health columnist and tireless opponent of all foodstuffs other than veggies and starch, and me. It was Brody, more than anyone else, who promoted the low-fat way of life to the masses, from the eighties on, with headlines like “Our Excessive Protein Intake Can Hurt Liver, Kidneys, Bone,” “Fill Up on Bread to Lose Weight,” and “Chemicals in Food Less Harmful Than Fat.”
As she revealed in a 1999 column, Brody was herself raised on a high-carb, low-fat diet of “shredded wheat, oatmeal, challah, Jewish rye and bagels,” the last, presumably, unblemished by the customary “shmear” of cream cheese. I, meanwhile, was raised on a diet that might strain even an Inuit’s gallbladder. We ate eggs every morning, meat for lunch, and meat again for dinner, invariably accompanied by gravy or at least pan drippings. We buttered everything from broccoli to brownies and would have buttered butter itself if it were not for the problems of traction presented by the butter-butter interface.
And how did Brody and I exit from our dietarily opposite childhoods? She, by her own admission, was a veritable butterball by her mid-twenties—a size 14 at just under five feet tall. I, at five-foot-seven, weighed in at a gaunt and geeky 110 pounds.
Fast-forwarding to the present, we assume Brody is now admirably trim, if only because of her exercise regimen, since otherwise she wouldn’t have dared to promote the low-fat dogma in person. For my part, I no longer butter my brownies, perhaps in part because of Brody’s tireless preaching. But the amount of fat she recommends for an entire day—one tablespoon—wouldn’t dress a small salad for me or lubricate a single Triscuit. I still regard bread as a vehicle for butter and chicken as an excuse for gravy or, when served cold, mayonnaise. The result? I’m a size 6 and have a cholesterol level that an envious doctor once denounced as “too low.” Case closed.
But if that doesn’t convince you, there’s now a solid medical explanation for why the low-fat, high-carb approach is actually fattening. A meal of carbs—especially those derived from sugar and refined flour—is followed by a surge of blood sugar, then, as insulin is released in response, a sudden collapse, leaving you often light-headed, cranky, headachy, and certainly hungrier than before you ate. Fats and protein can make you fat, too, of course, if ingested in sufficient quantity, but at least they fulfill the conventional role of anything designated as a foodstuff, which is to say that they leave you feeling like you’ve actually eaten something.