Dancing in the Streets: A History of Collective Joy


by Barbara Ehrenreich


  Just in the last few years, hundreds of books, articles, and television specials have been devoted to depression: its toll on the individual, its relationship to gender, the role of genetic factors, the efficacy of pharmaceutical treatments. But to my knowledge, no one has suggested that the epidemic may have begun in a particular historical time, and started as a result of cultural circumstances that arose at that time and have persisted or intensified since. The failure to consider historical roots may stem, in part, from the emphasis on the celebrity victims of the past, which tends to discourage a statistical, or epidemiological, perspective.10 But if there was in fact a beginning to the epidemic of depression, sometime in the sixteenth or seventeenth century, it is of obvious concern here, confronting us as it does with this question: Could this apparent decline in the ability to experience pleasure be in any way connected with the decline in opportunities for pleasure, such as carnival and other traditional festivities?

  There is reason to think that something like an epidemic of depression in fact began around 1600, or the time when Burton undertook his “anatomy” of the disease. Melancholy, as it was called until the twentieth century, is of course a very ancient problem, and was described in the fifth century BCE by Hippocrates. Chaucer’s fourteenth-century characters were aware of it, and late medieval churchmen knew it as acedia, which was technically a sin, since it often led to the neglect of religious obligations. So melancholy, in some form, had always existed—and, regrettably, we have no statistical evidence of a sudden increase in early modern Europe, which had neither a psychiatric profession to do the diagnosing nor a public health establishment to record the numbers of the afflicted. All we know is that in the 1600s and 1700s, medical books about melancholy and literature with melancholic themes were both finding an eager audience, presumably at least in part among people who suffered from melancholy themselves. Samuel Johnson, for example, was an admirer of Burton’s Anatomy of Melancholy, asserting it “was the only book that ever took him out of bed two hours sooner than he wished to rise.”11

  Increasing interest in melancholy is not, however, evidence of an increase in the prevalence of actual melancholy. As the historian Roy Porter suggested, the disease may simply have been becoming more stylish, both as a medical diagnosis and as a problem, or pose, affected by the idle rich, and signifying a certain ennui or detachment. No doubt the medical prejudice that it was a disease of the gifted, or at least of the comfortable, would have made it an attractive diagnosis to the upwardly mobile and merely out-of-sorts. Nervous diseases in general, Dr. Cheyne asserted, “never happen, or can happen to any but those of the liveliest and quickest natural parts whose Faculties are the brightest and most spiritual, and whose Genius is most keen and penetrating, and particularly when there is the most delicate Sensation and Taste, both of pleasure and pain.”12

  By the mid-eighteenth century, melancholy had indeed become a stylish stance among the affluent English, inspiring the insipid sentiments expressed in poems like Thomas Warton’s “The Pleasures of Melancholy” and Elizabeth Carter’s “Ode to Melancholy,” which reads in part: “COME, Melancholy! silent power/Companion of my lonely hour … / Thou sweetly sad ideal guest.”13 In fact the notion that melancholy was an exclusively elite disease was common enough to be a subject for satire. In a seventeenth-century English play, a barber complains of melancholy and is told: “Melancholy? Marry, gup, is melancholy a word for a barber’s mouth? Thou shouldst say heavy, dull, and doltish: melancholy is the crest of the courtier’s arms!”14 For their part, physicians probably were eager to diagnose melancholy, or, as it was sometimes called, “spleen,” among their better-off patients and, in general, to wrest the treatment of nervous disorders away from the clergy.

  But melancholy did not become a fashionable pose until a full century after Burton took up the subject, and when it did become stylish, we must still wonder: Why did this particular stance or attitude become fashionable and not another? An arrogant insouciance might, for example, seem more fitting to an age of imperialism than this wilting, debilitating malady; and enlightenment, another well-known theme of the era, might have been better served by a mood of questing impatience. Even when melancholy became popular as a poetic theme and social affectation, there were individual sufferers, like the poet William Cowper, who could hardly have chosen their affliction. He first fell prey in his thirties, when anxiety about an exam led to a suicide attempt that necessitated an eighteen-month stay in an asylum. Four more times in his life he was plunged into what he called “a melancholy that made me almost an infant”15 and was preserved from suicide only by institutionalization. And it is hard to believe he could have been feigning when he wrote to a friend, toward the end of his life, “I have had a terrible night—such a one as I believe I may say God knows no man ever had … Rose overwhelmed with infinite despair, and came down into the study execrating the day I was born with inexpressible bitterness.”16

  Nor can we be content with the claim that the apparent epidemic of melancholy was the cynical invention of the men who profited by writing about it, since some of these were self-identified sufferers themselves. Robert Burton confessed, “I writ of melancholy, by being busy to avoid melancholy.”17 George Cheyne was afflicted, though miraculously cured by a vegetarian diet of his own devising. The Englishman John Brown, who published a best-selling mid-eighteenth-century book on the subject of the “Increase of low Spirits and nervous Disorders,” went on to commit suicide.18 Something was happening, from about 1600 on, to make melancholy a major concern of the reading public, and the simplest explanation is that there was more melancholy around to be concerned about.

  There remains the question of whether melancholy, as experienced by people centuries ago, was the same as the disease we now know as depression. Even in today’s Diagnostic and Statistical Manual of Mental Disorders, the definitions of mental illnesses always seem a little fuzzy around the edges; and there was not even an attempt, until the eighteenth century, at a scientific or systematic nomenclature. “Melancholy,” in Burton’s account, sometimes overlaps with “hypochondria,” “hysteria,” and “vapors”—the last two being seen as particularly feminine disorders.19 But on the whole, his descriptions of melancholy could, except for the prolix language, substitute for a modern definition of depression: “Fear and Sorrow supplant [those] pleasing thoughts, suspicion, discontent, and perpetual anxiety succeed in their places” until eventually “melancholy, this feral fiend,” produces “a cankered soul macerated with cares and discontents, a being tired of life … [who] cannot endure company, light, or life itself.”20

  If we compare the accounts of melancholics from the past with one of the best subjective descriptions of depression from our own time—William Styron’s 1990 book Darkness Visible—we find what certainly looks like a reasonable concordance. Styron found himself withdrawing from other people, even abandoning his own guests at a dinner party, while Boswell said of Johnson during one of his episodes of melancholy, “He was so ill, as, notwithstanding his remarkable love of company, to be entirely averse to society, the most fatal symptom of that malady.”21 Styron listed “self-hatred” as a symptom, while the highly productive Johnson repeatedly upbraided himself for leading “a life so dissipated and useless.”22 More flamboyantly, John Bunyan bewailed what he called his “original and inward pollution”: “That was my plague and my affliction. By reason of that, I was more loathsome in my own eyes than was a toad.”23

  Another symptom mentioned by Styron is a menacing change in the appearance of the nonhuman world. Terror is externalized, like a toxin coating the landscape and every object within it. Styron found that his “beloved home for thirty years, took on … an almost palpable quality of ominousness.”24 Similarly William James, no doubt drawing from his own long struggle with the illness, wrote that to “melancholiacs,” “the world now looks remote, strange, sinister, uncanny. Its color is gone, its breath is cold.”25 These perceptions fit neatly with the sixteenth- and seventeenth-century melancholics’ notion that the natural world was itself in a state of deterioration—crumbling, corrupt, and doomed. As John Donne put it—and I cannot think of a more apt image of the world as it appears to a depressive—“colour is decai’d” (my emphasis).26 Hence, no doubt, Styron’s sense of immersion in a “gray drizzle of horror,” and Johnson’s repeated references to “the distresses of terrour.”27

  So I think we can conclude with some confidence that the melancholy experienced by men and women of the early modern era was in fact the same disorder we know today as depression, and that the prevalence of melancholy/depression was actually increasing in that era, at least relative to medieval times—though admittedly we have no way of knowing how substantial this increase was in statistical terms. We can return, then, to the question of the relationship between this early “epidemic of depression” and the larger theme of this book: the suppression of communal rituals and festivities. Very likely, the two phenomena are entangled in various ways. It could be, for example, that, as a result of their illness, depressed individuals lost their taste for communal festivities and even came to view them with revulsion. But there are other possibilities: First, that both the rise of depression and the decline of festivities are symptomatic of some deeper, underlying psychological change, which began about four hundred years ago and persists, in some form, in our own time. The second, more intriguing possibility is that the disappearance of traditional festivities was itself a factor contributing to depression.

  The Anxious Self

  One approaches the subject of “deeper, underlying psychological change” with some trepidation, but fortunately, in this case, many respected scholars have already visited this difficult terrain. “Historians of European culture are in substantial agreement,” Lionel Trilling wrote in 1972, “that in the late 16th and early 17th centuries, something like a mutation in human nature took place.”28 This change has been called the rise of subjectivity or the discovery of the inner self, and since it can be assumed that all people, in all historical periods, have some sense of selfhood and capacity for subjective reflection, we are really talking about an intensification, and a fairly drastic one, of the universal human capacity to face the world as an autonomous “I,” separate from, and largely distrustful of, “them.” As we saw in chapter 5, the European nobility had undergone this sort of psychological shift in their transformation from a warrior class to a collection of courtiers—away from directness and spontaneity and toward a new guardedness in relation to others. In the late sixteenth and seventeenth centuries, the change becomes far more widespread, affecting even artisans, peasants, and laborers. The new “emphasis on disengagement and self-consciousness,” as Louis Sass puts it,29 makes the individual potentially more autonomous and critical of existing social arrangements, which is all to the good. But it can also transform the individual into a kind of walled fortress, carefully defended from everyone else.

  Historians infer this psychological shift from a number of concrete changes occurring in the early modern period, first and most strikingly among the urban bourgeoisie, or upper middle class. Mirrors in which to examine oneself become popular among those who can afford them, along with self-portraits (Rembrandt painted over fifty of them) and autobiographies in which to revise and elaborate the image that one has projected to others. In bourgeois homes, public spaces that guests may enter are differentiated, for the first time, from the private spaces—bedrooms, for example—in which one may retire to let down one’s guard and truly “be oneself.” More decorous forms of entertainment—plays and operas—requiring people to remain immobilized, each in his or her separate seat, begin to provide an alternative to the promiscuously interactive and physically engaging pleasures of carnival.30 The very word self, as Trilling noted, ceases to be a mere reflexive or intensifier and achieves the status of a freestanding noun, referring to some inner core, not readily visible to others.

  The notion of a self hidden behind one’s appearance and portable from one situation to another is usually attributed to the new possibility of upward mobility. In medieval culture, you were what you appeared to be—a peasant, a man of commerce, or an aristocrat—and any attempt to assume another status would have been regarded as rank deception; sumptuary laws often barred the wealthy commoner, for example, from dressing in colors and fabrics deemed appropriate only to nobles. According to the historian Natalie Zemon Davis: “At carnival time and at other feast days, a young peasant might dress as an animal or as a person of another estate or sex and speak through that disguise … But these were temporary masks and intended for the common good.”31 But in the late sixteenth century, upward mobility was beginning to be possible or at least imaginable, making “deception” a widespread way of life. The merchant who craved an aristocratic title, the craftsman who aspired to the status of a merchant—each had to learn to play the part, an endeavor for which the system of etiquette devised in royal courts came to serve as a convenient script. You might not be a lord or a lofty burgher, but you could find out how to act like one. Hence the popularity, in seventeenth-century England, of books instructing the would-be member of the gentry in how to comport himself, write an impressive letter, and choose a socially advantageous wife.

  Hence, too, the new fascination with the theater, with its notion of an actor who is different from his or her roles. This is a notion that takes some getting used to; in the early years of the theater, actors who played the part of villains risked being assaulted by angry playgoers in the streets. Within the theater, there is a fascination with plots involving further deceptions: Shakespeare’s Portia pretends to be a doctor of law; Rosalind disguises herself as a boy; Juliet feigns her own death—to give just a few examples. Writing a few years after Shakespeare’s death, Robert Burton bemoaned the fact that acting was no longer confined to the theater, for “men like stage-players act [a] variety of parts.” It was painful, in his view, “to see a man turn himself into all shapes like a Chameleon … to act twenty parts & persons at once for his advantage … having a several face, garb, & character, for every one he meets.”32 The inner self that can change costumes and manners to suit the occasion resembles a skilled craftsperson, too busy and watchful for the pleasures of easygoing conviviality. As for the outer self projected by the inner one into the social world: Who would want to “lose oneself” in the communal excitement of carnival when that self has taken so much effort and care to construct?

  So highly is the “inner self” honored within our own culture that its acquisition seems to be an unquestionable mark of progress—a requirement, as Trilling called it, for “the emergence of modern European and American man.”33 It was, no doubt, this sense of individuality and personal autonomy, “of an untrammeled freedom to ask questions and explore,” as the historian Yi-Fu Tuan put it, that allowed men like Martin Luther and Galileo to risk their lives by defying Catholic doctrine.34 Which is preferable: a courageous, or even merely grasping and competitive, individualism, versus a medieval (or, in the case of non-European cultures, “primitive”) personality so deeply mired in community and ritual that it can barely distinguish a “self”? From the perspective of our own time, the choice, so stated, is obvious. We have known nothing else.

  But there was a price to be paid for the buoyant individualism we associate with the more upbeat aspects of the early modern period, the Renaissance and Enlightenment. As Tuan writes, “the obverse” of the new sense of personal autonomy is “isolation, loneliness, a sense of disengagement, a loss of natural vitality and of innocent pleasure in the givenness of the world, and a feeling of burden because reality has no meaning other than what a person chooses to impart to it.”35 Now if there is one circumstance indisputably involved in the etiology of depression, it is precisely this sense of isolation or, to use the term adopted by Durkheim in his late-nineteenth-century study of suicide: anomie. Durkheim used it to explain the rising rates of suicide in nineteenth-century Europe; epidemiologists invoke it to help account for the increasing prevalence of depression in our own time.36 As Durkheim saw it: “Originally society is everything, the individual nothing … But gradually things change. As societies become greater in volume and density, individual differences multiply, and the moment approaches when the only remaining bond among the members of a single human group will be that they are all men [human].”37 The flip side of the heroic autonomy that is said to represent one of the great achievements of the early modern and modern eras is radical isolation, and, with it, depression and sometimes death.

  But the new kind of personality that arose in sixteenth- and seventeenth-century Europe was by no means as autonomous and self-defining as claimed. For far from being detached from the immediate human environment, the newly self-centered individual is continually preoccupied with judging the expectations of others and his or her own success in meeting them: “How am I doing?” this supposedly autonomous “self” wants to know. “What kind of an impression am I making?” Historians speak of the interiorization that marks the new personality, meaning “the capacity for introspection and self-reflection,” but it often looks as if what has been “interiorized” is little more than the human others around one and their likely judgments of oneself.

 
