In Chicago, where the health commissioner claimed he’d “do nothing to interfere with the morale of the community” as “fear kills more than disease,” the death rate climbed from 15 percent of the afflicted to 40 percent that same month.44 As someone who is afraid of literally everything, I can assure you that fear does not have such a successful kill rate. In Buffalo, the acting city health commissioner claimed that “it was no uncommon matter to find persons who had waited two or three days after having repeatedly phoned or summoned physicians, suffering and dying because every physician was worked beyond human endurance.”45 Sophomores in medical school were called upon—well before they were properly trained physicians—to care for the afflicted. In New York City, 30,736 people died in September and October.46 One doctor, at New York’s Presbyterian Hospital, recalled going into the wards each morning and finding that every single patient in critical care had died. Every morning.47
All the doctors who worked through this epidemic and everyone who volunteered? They are heroes, too. The fact that they were not all awarded presidential medals for their service is an oversight.
Eventually, without clear guidance on how to effectively combat the disease and the mounting deaths, morale eroded. When people, who by this time were surrounded by clear evidence of the danger, attempted to find helpful information, they were met, over and over, with the message that everything was fine. Even when newspapers did offer truthful information, people were now unsure whether they could trust anything that was printed.
That October—which was the deadliest month in U.S. history, and that takes into account periods like, say, the Civil War—195,000 people died of the Spanish flu.48 If you are in a house with anyone else right now, you can put this in perspective by shouting, “One of us would be dead, probably!” Your family members love it when you do that. Now imagine Dr. Krusen in the background desperately exclaiming, as he did, “Don’t get frightened or panic-stricken over exaggerated reports.”49 The newspapers printed admonitions that read, “Don’t Get Scared.” After the children’s little bird poem, this may be the second-scariest, supposedly anodyne phrase in this chapter.
People were terrified.
In response, they began to behave like terrified people. They made placards that read, “Spitting Equals Death,” and culprits began to be arrested for spitting.50 I mean, that was not wrong (don’t spit on people if you have the flu), but arresting people for bodily functions is not a way to calm panic, nor was it especially effective. Face masks became mandatory for civil servants and those suffering from influenza, though the Washington Times claimed that “it [was] not thought the use of masks will become general, and health officials inclined to doubt their value for general use as a preventative measure.”51 This announcement was far less prominent in the paper than the anticipatory exclamation “Thanksgiving Is Coming!” which was illustrated with a picture of a soldier running after a turkey who is wearing a fez, like the guy in the other kind of Turkey. Thanksgiving would not be coming for over a month.
Soon masks were in widespread use by ordinary citizens, but, sadly, officials were correct in their predictions that they didn’t seem especially effective in combatting the disease.52 People began to rely on outdated folk remedies like onions—just as they had in the times of the bubonic plague.53 One shop merchant claimed he sold more quinine in one day than he had in the past three years, which would have been awesome and useful if only this had been an epidemic of malaria. Beyond quinine,54 advertisements for influenza remedies included:
• Use Oil of Hyomei. Bathe your breathing organs with antiseptic balsam.
• Munyon’s Paw Paw Pills for influenza insurance.
• Sick with influenza? Use Ely’s Cream Balm. No more snuffling. No struggling for breath.55
(Paw paw pills sound like something that should treat a cat’s ringworm.)
And it wasn’t just superstitious people who took to dubious medical treatments. Doctors recommended consuming more alcohol—whiskey, half a bottle of wine a day, and a glass of port before bed.56 Doctors began writing prescriptions for whiskey, which could be filled at drugstores in Philadelphia.57 In most cases drinking doesn’t solve anything, but in this particular instance not drinking wasn’t helping—nothing was—so it may have been an understandable response.
In Philadelphia most popular gathering places were shut down, though not, officials stressed, as a public safety measure. However, suppressing information about the reasons for these closures led to great inefficiencies and much confusion. For instance, rather than telling people they should really avoid cramming together on streetcars, especially if they were feeling ill, officials just issued a limitation on the number of people who could be on streetcars at any time. A man in England claimed that his movie theater was safe—and he was believed—because he had a special aerating machine. He may have had such a machine, and he may very well have believed it was effective, but it was not.58 Bars were the last establishments to close in Philadelphia,59 but the Savoy Hotel in London did a brisk business with its new Corpse Reviver cocktail of whiskey and rum. (It was slightly different from the traditional hangover cure, but both are similar in their abilities to make people feel a little less like they are dying.)60
People began to take matters into their own, often panicked and sometimes inebriated, hands. A health inspector in San Francisco shot a man who refused to wear a mask, and a man in Chicago is said to have screamed, “I’ll cure them my own way!” as he slashed open the throats of his family members.61 In London another man similarly cut the throats of his wife and two daughters when he recognized that he’d contracted the disease and that they would be left poor after his death.62
Spanish influenza came to be referred to as “the plague,” as in the bubonic plague of the fourteenth century. That can’t be surprising. Despite the six hundred years that had passed, every aspect of the public disarray and nightmarish nature of this time was more reminiscent of the Black Death than of any other outbreak. An internal American Red Cross memo stated that “a fear and panic of the influenza, akin to the terror of the middle ages regarding the Black Plague, has been prevalent in many parts of the country.”63 A Red Cross report in rural Kentucky explained that people had begun starving to death because they wouldn’t venture outside for food.
By November it seemed that people had given up attempting to fight. They stayed at home, terrified. While earlier during the outbreak in New York, Lillian Wald, the nurse and humanitarian, spotted “dignified and discerning women [who] stood on the steps at Altman’s and Tiffany’s Fifth Avenue shops and accosted passers-by” hoping to get funds to help fight the epidemic, now no one seemed to want to go anywhere.64 Elizabeth Martin, the head of Emergency Aid in Philadelphia, furiously reported that “hundreds of women who are content to sit back … had delightful dreams of themselves in the roles of angels of mercy, had the unfathomable vanity to imagine that they were capable of great spirit of sacrifice. Nothing seems to rouse them now. They have been told that there are families in which every member is ill, in which the children are actually starving because there is no one to give them food. The death rate is so high and they still hold back.”65 Dr. Victor Vaughn remarked: “If the epidemic continues its mathematical rate of acceleration, civilization could easily disappear from the face of the earth.”66 But, miraculously, it didn’t.
We don’t know precisely why the disease stopped killing people—scientists have various theories—but it did. The most common theory is that it simply killed off too many of its hosts.
It was only after it had abated—and after the war had ended—that people began writing about it. The American Journal of Medicine stated on December 28, 1918:
The year 1918 has gone: a year momentous as the termination of the most cruel war in the annals of the human race; a year which marked the end, at least for a time, of man’s destruction of man; unfortunately a year in which developed a most fatal infectious disease causing the death of hundreds of thousands of human beings. Medical science for four and one-half years devoted itself to putting men on the firing line and keeping them there. Now it must turn with its whole might to combating the greatest enemy of all—infectious disease.67
But there was little to combat. While a milder third wave of the disease appeared the next winter—and there would be periodic outbreaks through the 1920s—the deadliest period was over.
The Spanish flu is estimated to have killed somewhere between 25 million and 100 million people worldwide. Around 675,000 Americans are thought to have died of it. That’s more than died in the Civil War, and the Civil War went on for four years.
Scientists today are experimenting with reverse genetics to try to re-create the flu, using virus that was preserved in frozen corpses. The hope is that they can develop a vaccine that would stop the disease if it emerges again. However, that’s extremely challenging given the speed with which the virus has been known to mutate. They haven’t succeeded yet.68
So the disease is still out there, perhaps lurking under the ice somewhere, and there’s no cure.
If there’s another outbreak, mankind might not be so lucky as to survive. But we can at least avoid being so stupidly duplicitous.
John Barry, in The Great Influenza, writes: “Those in authority must retain the public’s trust. The way to do that is to distort nothing, to put the best face on nothing, to try to manipulate no one … Leadership must make whatever horror exists concrete. Only then will people be able to break it apart.”69
There are certainly better ways for government officials to combat public health crises. Maybe at minimum they should subsidize funerals so no one has to bury their children in macaroni boxes. Planning for emergency response teams and finding volunteers early might also be a good strategy. Government leaders failed here, in just about every way they could fail, but, as terrible as Woodrow Wilson was, I am willing to admit he did have some other issues going on at the time.
It’s the journalists I am most disappointed in. Perhaps because, ideally, journalists safeguard the public by telling them what they need to know. There’s a trope in movies and books where a reporter at some point screams something like, “We’re journalists! We tell the truth!” Sometimes that translates into some silly stuff, like honestly reporting to the public that Kim Kardashian has cellulite. But at its best, say, during Watergate or during the clergy abuse scandal, reporting the truth means that journalists act as public watchdogs. Often they defend the common man despite the wishes of those in power. They tell the truth, even when it would be easier to lie—when the government wants them to lie. That’s the highest possible aim of the fourth estate. Sadly, Senator Hiram Johnson’s 1917 claim that “the first casualty when the war comes is truth” proved accurate in this instance. In 1918, during the Spanish flu outbreak, owing to whatever misguided intentions, the fourth estate failed.
And although better coverage of the outbreak’s evolution in the press couldn’t have stopped the influenza virus, a single newspaper headline in Philadelphia saying “Don’t Go to Any Parades; for the Love of God Cancel Your Stupid Parade” could have saved hundreds of lives. It would have done a lot more good than the headlines telling people, “Don’t Get Scared!”
Telling people that things are fine is not the same as making them fine.
This failure is in the past. Journalists and editors had their reasons. Risking jail time is no joke. But learning from this breakdown in truth-telling is important because the fourth estate can’t fail again. We are fortunate today to have organizations like the Centers for Disease Control and Prevention and the World Health Organization that track how diseases are progressing and report these findings. In the event of an outbreak similar to the Spanish flu, they will be wonderful resources. I hope we’ll be similarly lucky to have journalists who will be able to share necessary information with the public. The public is at its strongest when it is well informed. Despite Lippmann’s claims to the contrary, we are smart, and we are good, and we are always stronger when we work together. If there is a next time, it would be very much to our benefit to remember that.
Encephalitis Lethargica
Only at times the curtain of the pupils
lifts, quietly—an image enters in,
rushes down through the tensed, arrested muscles,
plunges into the heart and is gone.
—RAINER MARIA RILKE, TRANSLATED BY STEPHEN MITCHELL
Here is a graphic showing the pace of medical advances for most of our known history:
It is a line showing conditions improving really, really slowly until 1900. It is my totally accurate and scientific portrayal of every single step forward and setback. I had to look up how to draw in Microsoft Word, and it took me probably four minutes to learn how to make this line, so, please, take a second to appreciate it.
Here is what happens to medical progress in the twentieth century:
“Wow! Those are some finely drawn lines,” you are probably saying. “It certainly seems as if we have made rapid advances in the last hundred years! But I want to see an even steeper line. Perhaps illustrating medical methods and treatments becoming radically more advanced within a single human lifetime. A fifty-year period or so. Accomplished drawer of lines—has such an event ever occurred?”
Do I have one such example for you! It is encephalitis lethargica—often abbreviated to EL—which raged from 1916 through the late 1920s. If you managed to get through the Spanish flu chapter and were only modestly terrified, well, you’re a strange human being, but settle in and prepare to have your socks scared off. EL gives Spanish flu a run for its money in terms of “terrifying uncured diseases that we have forgotten about in less than one hundred years.”
During the period in the teens and 1920s when EL was rampant, there were a million cases of the disease and more than five hundred thousand deaths worldwide.1 Those who survived were often trapped inside their own bodies. It’s a bit of a simplification, but you could think of it as a disease that turned you into a human statue, as in The Winter’s Tale. For a slightly more clinical explanation, H. F. Smith, who worked for the U.S. Public Health Service, described the disease in 1921 as
an epidemic syndrome characterized in most instances by a gradual onset with headache, vertigo, disturbances of vision, ocular palsies [inability of individuals to move their eyes normally], changes in speech, dysphagia [difficulty swallowing], marked asthenia [weakness], fever (usually of a low grade), obstinate constipation, incontinence of urine [loss of bladder control], a peculiar mask-like expression of the face, and a state of lethargy which gradually develops in the majority of the recognized cases into a coma that is more or less profound.2
But that is, perhaps, too unemotional a definition, and one that jams too many symptoms into a single sentence, to make the horrors of the disease seem real.
Let’s begin in 1917 when a young scientist named Constantin von Economo was working at the Wagner-Jauregg clinic in Vienna. He was in the air force during World War I and became the first person in Vienna to be granted an international aviation license. He loved flying hot-air balloons and airplanes. The fact that someone hot-air ballooned as their hobby is already supercool, but von Economo also loved the study of science; he obtained a degree in engineering as well as one in medicine. He was also well read, particularly enjoying works like Goethe’s Faust and Homer’s Odyssey. He was a man who was passionately interested in the world and the way it worked. Oh, he was also a baron and married to the daughter of a Greek prince. It’s baffling that there isn’t some sort of historical Fifty Shades of Grey based on this guy. When he decided to turn his attention away from flying to psychiatry in 1917, his wife worried he might miss aviation. He replied, “No, there is no longer so much new to do there.”3
He poses like an advertisement for “adventure.”
Fortunately, there was a great deal to study about mental conditions during this period; in that same year Freud wrote his Introductory Lectures on Psycho-Analysis. It is kind of odd that the study of certain conditions is considered “fashionable” in some time periods, but if ever studying mental issues was trendy, this was the time and Vienna was the place.
Von Economo soon encountered a wave of patients at the Wagner-Jauregg clinic who all seemed to be suffering from similar symptoms. They came in because their husbands, wives, or children were concerned that they had begun falling asleep inappropriately, for instance, at the dinner table as they were chewing their food or during consultations with their doctor. Patients who fell asleep in front of von Economo could be roused from their stupor, would squint at him briefly, and then go back to sleep.4
If you are tempted to exclaim, “That sounds like me on a Monday morning! I hate Mondays!” then:
1. I assume you are Jim Davis. Congratulations on Garfield’s success.
2. No, it does not sound like you. You can chew your food without physically collapsing.
There were other symptoms that affected patients in various ways. None resembled needing a nap from time to time. One patient couldn’t swallow and regurgitated food through her nostrils. Another couldn’t look to the left. Others couldn’t control their facial movements. One hallucinated that a group of people surrounded him. Some suffered from “forced laughter.”5 All seemed to be very, very tired, which led von Economo to conclude, “We are dealing with a sleeping sickness.”6