by Mark Greif
A second solution would be the trivialization of sex. This is much harder, because every aspect of the culture is so much against it, counterliberators and prudes included. Aldous Huxley warned of a world in which we’d arrange sexual intercourse as we make dates for coffee, with the same politeness and obligation. That now seems like an impossibly beautiful idyll. At least coffee dates share out assets pacifically. You meet for coffee with people you don’t really want to see, and people who don’t want to see you agree to meet you, and yet everyone manages to get something out of it. If only sex could be like coffee! But sex has not proved adaptable to this and probably never will, despite the recent overcoming of a heretofore limiting condition—the inability to summon physical arousal at will. The new pharmacopoeia of tumescence drugs will soon give way, according to reports of current clinical trials, to libido drugs that act directly on the brain rather than the vascular system—and for both men and women. I’m not optimistic they will produce a revolution in etiquette.
The reason it seems a sex of pure politeness and equal access does not work is that the constant preparation to imagine any and every other person as a sexual object (something our culture already encourages) proves to be ruthlessly egocentric and antisocial, making every other living body a tool for self-pleasure or gain. At times I wonder if we are witnessing a sexualization of the life process itself, in which all pleasure is canalized into the sexual, and the function of warm, living flesh in any form is to allow us access to autoerotism through the circuit of an other. This is echoed at the intellectual level in the discourse of “self-discovery.” The real underlying question of sexual encounter today may not be “What is he like in bed?” (heard often enough, and said without shame) but “What am I like in bed?” (never spoken). That is to say, at the deepest level, one says: “Whom do I discover myself to be in sex?”—so that sex becomes the special province of self-discovery.
Meanwhile, the more traditional way of de-emphasizing sex, by subordinating it to overwhelming romantic love, has diminished as an option as the focus on self-discovery has increasingly devitalized romantic love. Self-discovery puts a reflecting wall between the self and attention to the other, so that all energy supposedly exerted in fascination, attraction, and love just bounces back, even when it appears to go out as love for the other. When self-discovery is combined with the notion of a continually new or renewed self, and this newness is associated with literal or metaphorical youth—well, then you simply have a segment of the affluent first world at the present moment.
This means the de-emphasis of sex and the denigration of youth will have to start with an act of willful revaluation. It will require preferring the values of adulthood: intellect over enthusiasm, autonomy over adventure, elegance over vitality, sophistication over innocence—and, perhaps, a pursuit of the confirmation or repetition of experience rather than experiences of novelty.
The de-emphasis of sex and the denigration of childhood can still be put on the agenda of a humane civilization. However, I think it’s basically too late for us. Perhaps I simply mean that I know it is too late for me. If you kick at these things, you are kicking at the heart of certain systems; if you deny yourself the lure of sex, for example, or the superiority of youth, you feel you will perish from starvation. But if I can’t save myself or my children, probably, I still might help my grandchildren. The only hope would be, wherever possible, to deny ourselves in our fatuousness and build a barricade, penning us inside, quarantining this epoch that we must learn to name and disparage.
Let the future, at least, know that we were fools. Make our era distinct and closed so that the future can see something to move beyond. Record our testament, that this was a juvenile phase in liberation which must give way to a spiritual adulthood! Turn back to adults; see in the wrinkles at the side of the eye that catch the cobalt, the lines of laughter in the face, the prolific flesh, those subtle clothes of adulthood, the desire-inspiring repositories of wisdom and experience. Know that what we wish to be nourished upon is age and accomplishment, not emptiness and newness. Then, in sophisticated and depraved sexuality, rather than youth’s innocence and the fake blush of truth, let our remaining impulses run in the sex of the old for the old—until they run out. Make a model for a better era. Once more, my moderns—in a superior decadence, in adult darkness rather than juvenile light—rise to the occasion! One effort more if you wish to be liberators.
[2006]
* * *
*1 Feminist critiques of pornography rooted in an idea of male violence and revenge against the threat of women’s liberation might have predicted a different outcome in our age of equality: representations of the literal humiliation or subordination of adult women in power—bosses, senators, spokespeople. What they did not anticipate was a turn to sexualized youth. Though the two lines of critique are not at all incompatible (i.e., youth still may be a way of denying adult equality), one sees now that feminist critiques of youth and aging are proving to be more significant historically than porn critiques.
*2 I want to acknowledge two lines of thought that insist on the attraction to sexually mature children as natural, not social, contravening my account. One is the commonsense historical argument that until recently sexually mature children of the middle teen years were adults, because human beings used to marry in their teens. Natasha, the dream of Russian womanhood in War and Peace, one of the greatest novels of the nineteenth century, set in that century’s early years, is fourteen when she becomes the object of her first suitors’ attention—and admirable suitors too: hussars in the tsar’s army, and a count. Her girlishness is treated matter-of-factly by those who are drawn to it as an appealing aspect of her personality, and it is considered realistically by her parents, who are concerned she may be too immature yet to leave home and run a household. In the United States, as the historian Philip Jenkins has summarized, the standard age of sexual consent was ten years old until the 1890s, when it was raised to sixteen or eighteen depending on the state.
The other argument is one occasionally offered explicitly, but much more often implicitly, in the field of evolutionary psychology. Evolutionary psychology explains behavioral dispositions in modern human beings by the optimal strategies for passing on genes, through patterns hardwired into our brains by our evolutionary past and the continuing reproductive demands of the present. “Youth is a critical cue,” writes evolutionary psychologist David M. Buss in the standard book on the subject of sex, “since women’s reproductive value declines steadily with increasing age after twenty. By the age of forty, a woman’s reproductive capacity is low, and by fifty it is close to zero” (The Evolution of Desire). The desire for children from the moment of visible pubescence (say, twelve today) to the maximum age before reproductive decline (age twenty) may therefore be the best means for passing on genes. This inclination would be set beneath the level of consciousness, as men’s desire is targeted to females who are fertile, healthy, and poised for the longest period of childbearing possible before the decline sets in. On evolutionary-biological presuppositions, it ought to be the case that human males today and yesterday, and in every society, should be maximally attracted to newly postpubescent girls unless it be determined statistically that there is some ramping-up of reproductive success in the years after menarche—in which case, certainly, no later than fourteen or fifteen.
Neither the historical nor the biological argument seems to meet the problem of the sex child as we now know it, because, I think, neither captures our current experience of desire, in which the sex children come in only secondarily, through some kind of mediation of fancy; in our real lives adults feel the sexual appeal of other adults. Unless sexual desire is wholly unconscious, and the social level entirely a screen or delusion—a very complex delusion to cover biological determinism—then with the sex children it’s my sense that we are dealing primarily with the sexual appeal of youth rather than the actual determinative sexual attractiveness of youths, even if the latter also exists (even with biological bases). The social appeal would be something like a desire for the sex child’s incipience, the child’s taste of first majority before the rules clamp down: youth as eternal becoming, in eternal novelty of experience. Apart from such fancies, the appeal of sexually mature children seems to me relatively weak in culture, not strong. But I understand that introspection is not science, and I am aware this may not satisfy partisans of determinist “natural” views.
ON FOOD
Food riots have broken out this year in Haiti, Egypt, Mozambique, and Bangladesh. In New York, eight million people look into the refrigerator wondering what to eat.
Rice prices soar on the international market as shortages trigger unrest among people for whom rice is a dietary staple. The packaging of the rice on my pantry shelf tells a story about pristine fields and rare cultivation. “To keep our rice select, we inspect each grain…This may seem like a lot of extra work to you, but we care.”
“Care” is one of the fluctuating words of our time. As CARE, it is an international rescue organization; as demotic speech, a matter of whim and interest; in official talk (as of health care), it essentially means nursing or medicine. In one sense, I could care less if I have rice. In another, I care (and care for myself) a great deal: I put all kinds of worry and concentration into whether I will have white rice, brown rice, basmati, arborio…. Does brown rice leave more of the fibrous husk (for health)? Is polished rice more suitable to the Cambodian cuisine I’ll cook tonight (for experience)? And should I have salmon, which, with its omega-3s, is said to be good for my brain, heart, and mood? Or tofu, with its cholesterol-lowering soy protein, its isoflavones and selenium? Thus from taste or “choice” we stray outward to a vantage from which the wrong choice at dinner looks like death, where “care” becomes ambassador for compulsion.
Two generations ago, progress in the realm of food split along dual tracks. For those who don’t yet have enough, the goal remains to gain plenty, by any technical means available. For those whose food is assured, the task becomes to re-restrict it. This second movement has been underacknowledged.
Having had our food supply made simple, we devote ourselves to looking for ways to make it difficult. The more we are estranged from the tasks of growing and getting food, the more food thought pervades our lives. It is a form of attention that restores labor, rarity, experience, and danger to food’s appearance (its manifestations in the market and at the table) and its refusal (our rejection of unfit foods, our dieting). This parallels the new complication of other phenomena of bodily attention—specifically, modern exercise and sex. It will be objected that the care for food is a fascination only of the rich; this is false. Stretching from high to low, the commands to lose weight, to undertake every sort of diet for the purposes of health, to enjoy food as entertainment, to privatize food care as a category of inner, personal life (beyond the shared decisions of cooking and the family dinner), have communicated new thought and work concerning food to the vast middle and working classes of the rich Western countries, too.
I think there is something wrong with all this. Underlying my opposition is a presumption that our destiny could be something other than grooming—something other than monitoring and stroking our biological lives. Many readers will disagree. I respect their disagreement if they are prepared to stand up for the fundamental principle that seems to underlie their behavior: that what our freedom and leisure were made for, in our highest state, really is bodily perfection and the extension of life. One of the main features of our moment in history, in anything that affects the state of the body (though, importantly, not the life of the mind), is that we prefer optimization to simplicity. We are afraid of dying, and reluctant to miss any physical improvement. I don’t want to die, either. But I am caught between that negative desire and the wish for freedom from control. I think we barely notice how much these tricks of care take up our thinking, and what domination they exert.
—
The reason to eat food is no longer mainly hunger. There is now no point in your day when if you were to go without a meal you would fall into physical jeopardy. You “get hungry,” to be sure, but probably from birth to death never go hungry, though enough people in the world still do.*1 This is a change in life.
Confusion arises around any need that never gets fully activated. We know we have to eat because otherwise we will die. We direct our thoughts to the activation point, the terminal condition, even though we’ll never approach it. We act as if we are under compulsion for decisions none of which are determined by this need—as if our “provisional necessity” were a “fatal necessity.” “We have to eat”; but we don’t have to eat anything in particular, so extensive are our food choices, any of which is sufficient for life. (Traditional societies always existed subject to conditions of scarcity; choice was circumscribed when one’s food was given by whatever would grow.) “We have to eat”; but we don’t have to eat at any given moment, so regularly do we eat, so lavish are our meals. (Though of course you get hungry just hours after eating—from sitting, or waiting, or being. Hunger is recalibrated in half-inches where once it was measured in yards.)
It’s now considered possible for some of us to get ourselves to states in which eating once again feels medically necessary, even with respect to the timing of daily meals. “My blood sugar must be down.” “Remember to stay hydrated.” You become “hypoglycemic”: that is, lacking in sugars. You become “dehydrated”: that is to say, thirsty. You reach a point where you get lightheaded, sick, unhappy without your food. Truer states of privation can be achieved through exercise. (A kind of confusion is deliberately maintained between thirst, which activates in shorter time spans, and hunger, which is slow-building and diverse in its fulfillment—so that a glass of orange juice becomes a palliative to thirst that also contains, in the solids and sugars and vitamins and tangible substance, a kind of food.) But you don’t even have to wear yourself out to feel changes inside you. Enough of us monitor ourselves this closely in daily life. We become patients in a hypothesized emergency room, in which we move as specters.
These reduced states, and the ability to identify the feelings of them, go with a degree of discernment and class distinction. Lower-class people get hungry, and “we” get hypoglycemic. The redevelopment of biologically necessary hunger is considered morally superior to its widespread alternative, the lazy hunger of an addiction to abundance.
The poor say they want lunch. Don’t believe them. Put aside the hunger of these people who endlessly crave “junk foods” and “drug foods,” fats and sugars—the obese overeaters, one of the last classes of people it’s socially acceptable to despise. The exalted need of the momentarily dizzy armchair athlete is counterpoised to the cravings of the obese underclass.
—
Historically, the modern project of food has always been associated with an end to scarcity. Before modernity, the multiplication of food meant that the supernatural had entered the mundane: as with the miracle of the loaves and fishes, Jesus’s carbs and proteins. Not until the eighteenth and nineteenth centuries did plenty come into view as a practical possibility. Its realization crept up on theorists by gradual developments. Malthus declared it impossible that agricultural production’s carrying capacity (arithmetically increasing) could ever catch up with the level of population (geometrically increasing) less than two decades before capacity proved him wrong. Malthus wrote in 1798. The last major “subsistence crisis” to strike the Western nations en masse, according to the agricultural historian John D. Post, occurred in 1816–17. (The starvation conditions in 1845–47 in Ireland and parts of Central Europe because of potato blight he disqualifies as an isolated last catastrophe.) Periodic famine would no longer be a recurring feature of Western life, though it had been a basic condition for all human societies since the early Holocene.
The early years of the twentieth century ushered in a second transformation. The embrace of agricultural mechanization spurred a transition from the mere end of famine to the permanence of plenty, even overplenty. The United States led the worldwide change, which took place from about 1915 to 1970. The application of machine power, specifically the arrival of tractors, made labor hours drop despite ever larger harvests. Crop yields increased per amount of acreage cultivated, massively so with the introduction of hybrid strains of corn. Bruce L. Gardner, reanalyzing most recently the agricultural and economic data for the period, notes that US farms produce seven times the amount of food they did in 1900, while having shed two-thirds of their laborers. “As late as 1950, food consumed at home accounted for 22 percent of the average US household’s disposable income. By 1998 that percentage had been reduced to 7,” while we manage to eat more.
We live far enough after the period of the modernization of food to condescend to its achievements. The technical achievement of superabundance led to a predictable but short-lived celebration of technicized food itself: a commercial fetishism of the techniques of freezing and refrigerated transport, Swanson dinners and Birds Eye vegetables, and a lust for the re-engineering, preservation, and shelf stability that made Cheez Whiz and Pringles out of smooth Wisconsin milk and bursting Idaho potatoes. Corresponding to modernization was an early modernism of food, a recognizable trajectory through attitudes well known to us from painting or design or writing. Postwar modernization theory held that modernizing was exclusively an economic-technical achievement, one that stood apart from the sorts of aesthetic regimes that succeeded one another in progress in the arts, but it was not so. “Food science” represented a moment of human progress, recognized and regnant, transformed into culture. When mid-century food technics are satirized today by right-thinking people as kitsch—Tang, fish sticks, and Wonder Bread—a moment of utopian progress is reduced to folly, as can happen with so much of the naive ecstasy and radiance of all products of the machine triumphant.