The United States ranks near the middle of the nations reporting to the WHO. Although its rate of 11.0 in 2002 was close to what it was at the turn of the twentieth century (10.2), it has fluctuated over time. During periods of economic depression there is more suicide; during times of war, when, as Durkheim pointed out, personal woes are overshadowed by the larger conflict, there is less. During World War I the rate dipped from 16.2 in 1915 to 11.5 in 1919 before rising steadily in the twenties. The suicide rate crested during the Depression, reaching its apex in 1932 at 17.4. As the economy stabilized, so did the suicide rate, and by 1936 it had dropped to 14.3. During World War II the rate sank to a low of 10.0. After the war it rose slightly, and ever since it has remained fairly constant, ranging from a low of 9.8 in 1957 to a high of 13.1 in 1977. (Not all wars have an ameliorative effect on the rate. Although the rate of suicide among Vietnam veterans has been high, the war itself had little impact on the country’s rate, perhaps because it was so controversial, fragmenting rather than uniting the citizens as did the more “popular” world wars. Similarly, the war in Iraq will likely have little effect on the national rate, although at least twenty army men and women serving in Iraq took their own lives during the first year of the war and seven others killed themselves not long after returning home—a rate of suicide nearly a third higher than the army’s historical average.)
Many studies have shown that suicide rates fluctuate with the economy. When the United States rate is graphed against economic indicators over time, the two lines move in nearly opposite directions. A growing body of research links unemployment and ill health, suggesting that the stress of joblessness triggers problems in marriages, conflicts with children, and physical and mental difficulties among vulnerable people. Examining data from 1940 to 1970, sociologist M. Harvey Brenner of Johns Hopkins University estimated that when unemployment rises one percentage point, 4.1 percent more people complete suicide.
Within the United States the suicide rate varies widely. Nevada has long had the highest rate of any state, consistently twice that of the nation as a whole. Las Vegas and Reno are magnets for the transient, the divorced, and others hoping to reverse their fortunes. Other states with consistently high rates are Florida, Arizona, Colorado, Wyoming, Alaska, Montana, New Mexico, Oregon, and California—all Western states with the exception of Florida, whose high rate can probably be accounted for by its unusually high proportion of elderly citizens. Some attribute high Western rates to the stereotypical image of the Western male as tough, unemotional, and willing to use violence as a solution (and, with the West’s high rate of gun ownership, to a speedy, lethal means of effecting it). Others hypothesize an “end of the road” theory, suggesting that people often move West with the expectation of changing their life, but when their problems persist, they may become disappointed, hopeless, and suicidal. Indeed, ever since the early nineteenth century, statistics have shown that Americans who move within the United States are at higher risk of killing themselves. “The suicide rate seems to mirror American migrations,” writes psychiatrist Howard Kushner in Self-Destruction in the Promised Land. “. . . It is a historical rule of thumb that wherever the in-migration is the greatest as a percentage of the total population, so is the overall suicide rate.” (Over the past several decades, in fact, with increased migration to the Sun Belt, rates in the South and Southwest have been rising, although this is attributable in part to the advanced age of many of those sun-seekers.) Indeed, the lowest rates are generally found in the relatively more stable Northeast.
New England has had a consistently low rate, which some credit to “Yankee fortitude,” although this explanation has been contradicted by the recent appearance among the states with the highest rates of Vermont, where Yankee fortitude had been thought to be of a particularly potent strain.
Migration affects the suicide rate; immigration may have an even greater effect, adjusting to a new country being even more disorienting than adjusting to a new state. First-generation immigrants have rates more proportionate to those of their homelands—albeit two or three times higher—than to those of their adopted country, although the rates converge toward that of the host nation over time. For instance, German, Austrian, and Scandinavian immigrants to the United States have extraordinarily high rates, while Italian, Irish, and Greek immigrants have relatively low rates. Danish psychiatrists have pointed to the high rate of Norwegian emigration to the United States as the cause of Norway’s low suicide rate compared to Denmark or Sweden, arguing that depressed and suicidal Norwegians emigrated and became subsumed in American statistics. Among Scandinavian immigrants, however, the Danish and Swedish rates remain two or three times the Norwegian rate.
Ever since statistics on suicide were first kept, researchers and reformers have suggested that the rate of suicide is lower in the country than in the city, where, as one sociologist put it in 1905, “the struggle for existence is carried on with the greatest keenness, and . . . nervous tension reaches its highest pitch.” Chief blame for the rising suicide rates of the nineteenth and twentieth centuries was placed on “urbanization.” But in this country the difference between urban and rural rates has become less pronounced in recent decades, and studies from around the world now show higher rates of suicide in rural than in urban areas. (In China, for example, the rate is two to five times greater in rural regions than in cities.) In the United States, the change may in part be due to the increasingly hard-pressed economy in rural areas—dramatically expressed by the rash of suicides among bankrupted farmers in the eighties and nineties—as well as to limited access to mental health services and emergency care, greater availability of firearms, and the reluctance of a traditionally self-reliant population to reach out for help.
Just as it was long assumed that higher suicide rates were to be found in the city, it was also assumed that the larger the city, the higher the rate. This is not always true. As early as 1928, sociologist Ruth Cavan pointed out that the suicide rate depends less on a city’s size than on its age. Rates tended to be highest in relatively new cities like San Francisco, Oakland, Los Angeles, and Seattle, where traditional social institutions—church, family, schools—were more fragmented. Indeed, in older, well-established Eastern cities such as Philadelphia, the rates are moderate, and people are often surprised to learn that New York City’s rate is far lower than that of the country as a whole, leading some to suggest that the grit of that city cultivates a survival mentality. Rates vary not only from city to city but within cities themselves, being highest in run-down sections and in neighborhoods with shifting populations. A study of Minneapolis suicides from 1928 to 1932 found them concentrated in the center of the city, an area of rooming houses and cheap hotels that the researcher called “a land of transiency and anonymity.” Studies of Seattle and Chicago yielded similar results. In his 1955 district-by-district survey of London, Peter Sainsbury found that “social isolation” was a more important factor than poverty in determining high-risk areas. In the poor but close-knit working-class sections of London’s East End, the rate was far lower than in prosperous districts like Bloomsbury, whose comfortable houses were interspersed with one-room flats, transient hotels, and boardinghouses. He also found high rates around railroad stations and areas settled by immigrants and the newly rich, both of whom, he suggested, faced problems of adjustment. Twenty-seven percent of London suicides had been living alone, while only 7 percent of the general population lived alone.
(One cannot, of course, conclude from these results whether suicidal people are drawn to living in lodging houses or whether living in lodging houses drives people to suicide.)
In the nineteenth century, differing rates among countries were often attributed to climate. (As late as 1930, San Diego’s high rate was blamed on “too much sunshine”; more likely the real culprit, as in Florida, was the concentration of elderly people.) These days climate’s effect is said to be negligible—studies by psychiatrist Alex Pokorny in the 1960s exploring the relationship between suicide and temperature, wind speed, barometric pressure, relative humidity, and seven other meteorological variables found no significant effect. Time of year, however, plays a role. Although Ishmael, in Herman Melville’s Moby-Dick, described suicidal depression as “a damp, drizzly November in my soul,” T. S. Eliot was a more accurate emotional weatherman: for suicides, April is the cruelest month, its rate some 12 percent above the average for the rest of the year. In November, in fact, the rate is near its nadir. The winter months generally have the lowest rates, and contrary to conventional wisdom, there is no increase around Christmas, New Year’s, or any other major holiday, although a British study found an increase in attempts on Valentine’s Day. Perhaps the rate rises in the spring and early summer because a person’s despair may be heightened by the regeneration around him. “A suicidal depression is a kind of spiritual winter, frozen, sterile, unmoving,” wrote A. Alvarez. “The richer, softer, and more delectable nature becomes, the deeper that internal winter seems, and the wider and more intolerable the abyss which separates the inner world from the outer. Thus suicide becomes a natural reaction to an unnatural condition.” (In Girl, Interrupted, a memoir of her stay in a psychiatric hospital, Susanna Kaysen put it more drily: “It was a spring day, the sort that gives people hope: all soft winds and delicate smells of warm earth. Suicide weather.”) More than two thousand years ago, Hippocrates observed that melancholia was more likely to occur in spring and autumn; contemporary research has found that while many depressive episodes begin in winter, they reach their greatest intensity in spring, with a smaller, secondary peak in the fall.
This variation may have biological roots, as there are pronounced seasonal fluctuations in neurotransmitter levels (including serotonin), as well as in certain hormonal activity, which can cause disruptions in mood, energy level, sleep patterns, and behavior.
Ever since 1833, when M. A. Guerry examined 6,587 French suicides and found that a disproportionate number took place on the first day of the workweek, Monday has been the most popular day for suicide—perhaps because people are returning to the “real world” of school and jobs after the exhilaration of the weekend. The beginning of a new week may seem to promise a new beginning, a rebirth; when it turns out to be no different it can be depressing, a dynamic reflected in popular songs such as “Blue Monday” and “Stormy Monday.” (In Guerry’s time, when the workweek lasted six days, Sunday was the least popular day for suicide; today, Saturday is.) Time of day? Though it is commonly assumed that most suicides take place in the dark recesses of the night, they are more likely to occur in the morning, which may constitute a sort of miniature version of spring: The world is getting up and starting anew—why can’t I? This pattern, too, may be driven by chemistry: most depression is circadian, and depressed people commonly feel especially anxious on waking.
Conventional wisdom has long held that police, doctors, and dentists kill themselves at abnormally high rates. “If a person works in an occupation which brings him in close contact with death and provides him with convenient means to end his own life, suicide poses a greater danger than in more innocuous professions,” wrote the authors of Traitor Within: Our Suicide Problem, in 1961, noting that executioners, whose careers are devoted to killing others, also appeared to have a high rate of killing themselves. Early studies of suicide by occupation were confounded, however, by demographic variables, including age, gender, and marital status, all of which affect suicide rates independently. Sociologist Steven Stack points out, for instance, that suicide rates for elementary-school teachers are 44 percent lower than for the working-age population in general, but when one controls for gender—the majority of elementary-school teachers being women and women having a much lower rate than men—there is no significant difference. In his 2001 study, “Occupation and Suicide,” Stack controlled for such factors and found health professionals to be at highest risk: dentists topped the list with a rate 5.4 times higher than expected, followed by physicians and nurses. (“Dentists suffer from relatively low status within the medical profession and have strained relationships with their clients—few people enjoy going to the dentist,” Stack has suggested.) Mathematicians, scientists, artists, and social workers also appear to be at increased risk, while police have a rate only slightly higher than expected, when compared to other working-age men. (Executioner was not among the thirty-two occupations considered by Stack.)
Other researchers have parsed the medical field still further to find that surgeons, who may feel directly responsible for the life and death of their patients, tend to have high rates, while obstetricians, pediatricians, and radiologists have lower ones. Psychiatrists may have the highest rate of any medical specialty—six times that of the general population, according to some studies. Estimating that one in three psychiatrists suffers from depression—three times the rate in the general population—the authors of a study of psychiatrists and suicide suggest that the field attracts troubled people seeking to understand their own problems. Noting that eight of Freud’s closest disciples had killed themselves, the neurologist Walter Jackson Freeman—who, as the gung-ho promoter of prefrontal lobotomy in this country, no doubt had some complicated reasons for his own career path—called suicide “a vocational hazard for the psychiatrist.”
To account for the elevated rate of physician suicide in general, experts point to the high stress level of the work and the tendency of doctors to keep their feelings inside. The type of personality often attracted to the field of medicine, they say, may be especially vulnerable. “It draws workaholics, overly conscientious people who take failure poorly, and idealists, who are frequently disappointed during their careers,” psychiatrist Robert Litman has observed. In addition, physician suicide is encouraged by the ready availability of lethal drugs and the knowledge of how to use them. (More than half of physician suicides overdose, while only 12 percent use guns—numbers that are nearly reversed in the general population.) Physicians, who have a hard enough time recognizing depression in their patients, are slow to recognize depression in themselves and, even when they do, may be reluctant to seek help—hardly surprising given that medical licensing boards may question their fitness to practice if they are being treated for a psychiatric condition. The suicide rate is especially high among female physicians, lending support to research suggesting that women who enter male-dominated professions, such as female chemists and soldiers, may be at increased risk.
The challenges of assimilating into an entrenched culture may also play a part in African-American suicide. For many years it was believed that suicide was, as one researcher put it, “a white solution to white problems.” Indeed, despite facing poverty, violence, and two hundred years of oppression, blacks in this country have historically had a suicide rate about half that of whites. Attempts to explain this were based on Durkheim’s suggestion that the greater a person’s status, the greater the potential fall and the greater the chance of suicide. Suicide, it was said, was a luxury blacks couldn’t afford because they were too busy trying to survive. “Black folks have so many problems they don’t even have time to think about committing suicide,” went the old saw. (Comedian Dick Gregory quipped, “You can’t kill yourself by jumping out of the basement.”) A more psychologically sophisticated explanation for the low black rate derived from Freud’s belief that suicide is the result of murderous impulses toward a lost love object turned inward. In dealing with frustration and aggression, social groups were said to turn either to homicide or to suicide, and rates varied inversely in a given community. Sociologists pointed to the high homicide and low suicide rates among American blacks (as well as to the low homicide and high suicide rates in Sweden and Denmark) as evidence. The generally held—if rarely expressed—opinion was that blacks killed other people while whites killed themselves.
Although the black homicide rate is indeed high—seven to ten times higher than that of whites—suicide is also a significant problem for blacks, particularly among young males. From 1980 to 1995, while the rate for white males age ten to nineteen increased only slightly, the rate for young black males more than doubled. The increase was especially precipitous—233 percent—among blacks age ten to fourteen. (Like the overall adolescent rate, the youthful black rate plateaued and dipped in the mid-nineties; by 1998, the rate had subsided to what it had been in the early eighties.) Throughout this time, the rate among elderly blacks has remained low, about one-third that of whites, putting in sharp relief the distinctive age pattern of black suicide. The rate peaks in youth (47 percent of black suicides occur among those age twenty to thirty-four, although this group makes up only 22 percent of the black population), then levels off after age thirty-five while the white rate rises. Why is the young black male rate so high? Why is the elderly black rate so low? Until relatively recently these questions went unexplored.
Herbert Hendin’s study of youth suicide in Harlem in the late sixties was one of the first close looks at African-American suicide. He learned that the rate for black New York males age twenty to twenty-five was higher than that for white males of the same age—in some years twice as high. This had been true since 1910, when detailed records were first kept in New York City. His findings contradicted conventional thinking on the relationship between suicide and violence. Interviewing young black men and women who had made serious suicide attempts, he found a direct relationship, not an inverse one, between suicide and homicide. Almost all had a history of violence in their childhoods—fathers who were physically violent or who died violent deaths, mothers who were abusive—and violence became a part of their lives. They had often thought of killing someone else—sometimes it didn’t seem to matter whom—before they attempted to kill themselves. “Many of these subjects came to life only through acts or fantasies of violence,” wrote Hendin. “In merely talking of past fights or brutality they became far more animated than usual. They see living itself as an act of violence, and regard death as the only way to control their rage.”