There was also a move to find someone to blame for the terrible situation people found themselves in. In Europe during the Middle Ages, the people who served as the scapegoats were the Jews. What happened to the Jewish population in the plague era is probably second only in its horror to the Holocaust.* Jews were accused of poisoning wells to make people sick and thereby take over the Christian world. Victims of the disease were themselves blamed for spreading it. People suspected of witchcraft or sorcery were also targets.
At the time the plague struck in the fourteenth century, the population of England was six million souls, which many experts consider to be the maximum carrying capacity for that era. This number was reduced to two million in just a few years. The population would not rebound until the 1700s—more than three hundred years later.
There were subsequent outbreaks of the Black Death, the Great Pestilence, or the Great Mortality, as it was variously called, every couple of years, as though the disease were returning to claim the lives it had missed on the previous go-round. These recurrences kept the recovery from really taking hold, because it was only natural for people to assume and plan for the worst.
We can’t know how many people in all died. While estimates put the figure at 75 million, countless out-of-the-way farms and towns and even cities may not be included in the final toll. With the total population of western Europe at the time estimated at a bit more than 150 million, this means that about half of the entire population of just that area was wiped out. (A similar percentage today would see more than 300 million people die in Europe alone.)*
The ramifications were extensive. In our modern world—which has seen nothing but exponential population growth, to such a degree that it may be threatening the planet’s ecosystem—the idea of populations collapsing is hard to imagine.* Once again, it is mostly in tales of science fiction that we explore themes like nature reclaiming cultivated land because the humans who worked it have died (as has happened at Chernobyl, for example, where, after the nuclear meltdown, wildlife and wild vegetation have repopulated the wasted site). Yet that’s what was going on in areas hit hard by the plague.
For example, before the pandemic in western Europe, almost no land was available for new farms. That all changed when so many people perished in such a relatively short time. With something like 40 percent of the population gone, peasants who had previously owned nothing moved onto land and into houses that had belonged to the now deceased. Many fields that had once been forest and had been laboriously cleared and cultivated now reverted to their natural state. To modern eyes accustomed to the environment succumbing to increasing human encroachment, it seems perhaps heartening to see a sort of counterattack by nature, taking back the land.
Before the plague struck, peasants were afraid to protest poor working conditions, but after, all bets were off. To paraphrase Barbara Tuchman, modern man may have been born because of the Black Death. Suddenly, the well-ordered class system—one often defended by those who benefited from it as “divinely ordained”—didn’t matter so much, and ideas of equality and merit-based advancement seeped in where nobility and lineage had previously held sway. Population disasters always prompt questions about balance, and they are usually the sort of questions that are easier to ask in the context of animal ecosystems than in human ones. Recently, for example, the idea was floated that, due to the incredible loss of life during the Mongol conquests in the thirteenth century, Genghis Khan may have shrunk humankind’s carbon footprint on the planet. Is that a cause to celebrate? If our ability to massively lower the traditional death rate from disease is part of what explains our highest-of-all-time global population level, have we perhaps thrown a monkey wrench into a self-correcting system that was keeping things in balance?*
Perhaps an optimist (or a pessimist?) would point out that nature is always evolving. This battle of man versus microbe is far from over. As nature regularly reminds us, there’s always something new in the microbial pipeline to replace what no longer works.*
In 1918, during a century in which the most modern of societies thought such epidemics were a thing of the past, people got a reminder that even seemingly routine illnesses can be potentially civilization-threatening under the right conditions. A malady that would be dubbed the Spanish Flu struck while the devastating First World War was raging, and soon its death toll greatly surpassed that of the war.*
Perhaps one of the most astonishing things about this flu was that at the time it hit, humanity had made great strides in medicine. But when American service personnel started showing symptoms, the experts were stumped. The author John Barry describes in The Great Influenza how sailors mysteriously began bleeding from their noses and ears, while others coughed blood. “Some coughed so hard the autopsies would later show they had torn apart abdominal muscles and rib cartilage,” Barry writes. Many were delirious or complained of severe headaches “as if someone were hammering a wedge into their skulls just behind the eyes” or “body aches so intense they felt like bones breaking.” Some of the men’s skin turned strange colors, from “just a tinge of blue around their lips or fingertips” to skin “so dark one could not tell easily if they were Caucasian or Negro.”
A couple of months before the appearance of these extraordinary symptoms, autopsies of crewmen from a British ship who had died after experiencing similar trials showed “their lungs had resembled those of men who had died from poison gas or pneumonic plague.”
More alarming, Barry writes, were the speed and scope of the disease’s spread, despite efforts to isolate and contain even people who hadn’t shown symptoms but had merely been exposed: “Four days after that Boston detachment arrived, nineteen sailors in Philadelphia were hospitalized. . . . Despite their immediate isolation and that of everyone with whom they had had contact, eighty-seven sailors were hospitalized the next day . . . two days later, six hundred more were hospitalized with this strange disease. The hospital ran out of empty beds, and hospital staff began falling ill.” As the sick overwhelmed the facility, officials began sending new patients to civilian hospitals, while military personnel continued moving among bases around the country, exposing ever more people.
What began in Philadelphia—at least in its most dangerous form—quickly advanced. There was still an international war on, and modern transportation had made great strides, so the virus could get from place to place at a far greater pace than any previous pandemic could. The collision of this outbreak with this first period of true globalization was devastating.* At its height, whole cities in the United States were virtually shut down, as areas where human beings congregated were closed to prevent people from transmitting the illness.* People stayed home from school and work rather than risk exposure, and the gears of society in some places seemed imperiled by the justifiable fear of getting sick.* Modern epidemiologists estimate that by the time the flu receded in 1920, it had killed somewhere between fifty and one hundred million people; “roughly half of those who died were young men and women in the prime of their life, in their twenties and thirties,” Barry writes. “If the upper estimate of the death toll is true, as many as 8 to 10 percent of young adults then living may have been killed by the virus.”*
The disease wasn’t just remarkable for the number of its victims, but also for the compressed nature of its devastating labors. Although it took two years to come and go, “perhaps two-thirds of the deaths occurred in a period of twenty-four weeks, and more than half of those deaths occurred in even less time, from mid-September to early December 1918.” That amount of damage in that short a period of time is disorienting and potentially destabilizing for a society.
All this happened in an age when we understood a lot about biomedicine. We understood that germs spread disease; we understood that preventing contact limited exposure. Indeed, doctors quickly figured out that what was killing sailors in Philadelphia was a strain of influenza, but it was unlike any they had seen before, and nothing they did could contain it. As much as a fifth of the entire population of the planet contracted it, and as much as 5 percent died from it. In sheer numbers, it was the deadliest pandemic to hit humankind, but as a percentage of the human population alive at the time, it wasn’t nearly as bad as the Black Death that hit western Europe in the mid-fourteenth century. So, humankind didn’t exactly dodge a bullet—the damage was severe and widespread—but it could have been much, much worse.
It still can be. The same sense of hubris affects us today as affected the generation that was blindsided by the Spanish Influenza. To most people living today, a modern epidemic comparable to the great ones of the past seems more akin to science fiction than to a realistic possibility.* But those who regularly work with infectious diseases and see the Black Death–like damage that something like Ebola or Marburg virus can do on a small scale in isolated communities are all too aware of how a hemorrhagic fever virus in one global region, or an avian flu mutation somewhere else, could remind us that, just like the Titanic, our civilization is not unsinkable. In fact, the culprit need not even be something new.
On September 11, 1978, a British woman named Janet Parker became the last person officially recorded to die of smallpox. It was rather strange that she contracted it at all, since by that time the disease had been almost eradicated worldwide, and on the rare occasions a person was infected, it was usually in some last-holdout, out-of-the-way location like the rural subtropics. Parker, however, was infected in the modern city of Birmingham in the Midlands of the United Kingdom. She worked in a room above a lab that contained samples of this insanely prolific killer and is thought to have somehow contracted the virus from there. She had been vaccinated against smallpox during her lifetime, but not recently enough.*
In large part because of Parker’s infection, the World Health Organization ordered that all samples of the virus in existence be turned over to one of two extra-secure sites, where live smallpox samples still reside—one in the United States, and one in Russia.* A debate has been ongoing ever since over whether those samples should be destroyed so that the threat will be forever removed. Both the US and Russian governments have opposed the idea, saying the samples are important to study and might be needed someday.*
In a New York Times op-ed piece in 2011, Kathleen Sebelius, then the secretary of the US Department of Health and Human Services, laid out some chilling reasons the government believes that samples should be retained, including that other nations may have surreptitiously kept their own samples, or that mislabeled or forgotten samples might be lying around somewhere: “Although keeping the samples may carry a minuscule risk, both the United States and Russia believe the dangers of destroying them now are far greater. . . . The global public health community assumes that all nations acted in good faith; however, no one has ever attempted to verify or validate compliance with the WHO request [to destroy live smallpox samples]. It is quite possible that undisclosed or forgotten stocks exist.”
In fact, on more than one occasion, we’ve found some.* Let’s hope no terrorists ever do.
The traditional Black Death–type pathogen has already been used in attempts at weaponization. One of the old theories about the Black Death was that the Mongols brought it to Europe and spread it while assaulting urban centers—the claim was that they launched infected corpses over the city walls. While that may or may not be true, there’s no doubt that in the 1930s and 1940s the Japanese military deliberately introduced the plague into fleas and then dropped the fleas over Chinese cities.
Bacteriological warfare has come a long way since then. In fact, airborne pathogens used against a population pose a more frightening and potentially more destructive threat than any other weapon in global arsenals. Nuclear and chemical armaments are terrible, but both have limits to their lethality. Because a killer pathogen can spread from person to person* and continue to kill for generations (or forever), a man-made plague might be worse than anything nature’s previously thrown at us.*
What about new diseases? New strains of flu jump from pigs and birds to humans just about every year. The Spanish Flu was unknown until it showed up. AIDS was unknown until it showed up.* There are also diseases we’ve “extinguished” that could become dangerous again, whether through mutations on their part or the diminishing effectiveness of counterresponses like vaccines, treatments, antibiotics, or antidotes on ours.
While it is natural to focus on the direct effects of a mass death event resulting from a pandemic, the ripple effects are often just as influential. Reading today’s expert literature makes it clear that modern authorities are as worried about the dangers associated with fear, uncertainty, and irrationality on the part of the public as they are about the actual direct dangers of any future pathogen.* History would suggest they are probably right to be so worried.
Even a slow-moving tragedy like the AIDS outbreak threatened all sorts of panic, backlash, and prejudicial responses when it first took hold in the public consciousness. As bad as it was in the 1980s, imagine how much more acute the reaction would likely have been had AIDS behaved more like cholera or smallpox—infecting people through the air they breathed or the water they drank, and killing them in a matter of days. It’s hard to imagine a human society acting rationally or humanely if mortality began reaching catastrophic levels. In the past, societies have been reshaped and at times have nearly crumbled under the weight of a pandemic. It’s possible that, facing mortality rates of 50, 60, or 70 percent—as people who lived through the Black Death did—we might do as they did: turn to religion, change the social structure, blame unpopular minorities and groups, or abandon previous belief systems. We can learn from how people in other eras responded to a catastrophic situation, and we can ask ourselves: With all our modern technology and science and medical knowledge, how would we respond? How much would our better understanding of the science behind epidemics help blunt the fear and panic that seem to be the natural response to such a threat?
While we have infinitely more tools in the medical toolbox to combat any modern disease, the modern world also confers some advantages on pathogens. After all, we live in a world with a far greater level of interconnectivity than in any past era. Contagion can now spread on a far greater scale and at a far greater speed than ever before. A pandemic-level disease could travel around the world before experts even knew there was a problem.
What’s the likelihood that humanity has already experienced the worst plague it will ever encounter? In the famous science fiction classic The War of the Worlds, author H. G. Wells has the alien would-be conquerors defeated ultimately by Earth’s pathogens. Let’s hope those same planetary defense mechanisms don’t get us first.
Chapter 7
The Quick and the Dead
Unless humankind can break patterns of collective behavior that are older than history itself, we can expect to have a full-scale nuclear war at some point in our future. The great regional or global geopolitical rivals at any given time and place have been squaring off with each other since the first cities arose in Mesopotamia, and it seems unrealistic to imagine that this has forever ended. Despite intermittent peaceful eras, there have always been wars. But the next Total War will be the first one in which both sides possess weapons powerful enough to destroy civilization—and efficient enough to do it in an afternoon.
On October 30, 1961, a specially modified Soviet aircraft dropped a nearly 60,000-pound, 50-megaton thermonuclear bomb over a test site in the Arctic. Known then as a “hydrogen bomb,” it was by far the most powerful weapon detonated before or since. Part of the intent of the test was to demonstrate to the United States in no uncertain terms the destructive capabilities possessed by the Soviet Union. It wildly exceeded those goals—it sent a message to future ages.
In fact, the Soviet Union had wanted to use an almost incomprehensibly powerful 100-megaton weapon instead,* but one of the physicists on the project, the future dissident and peace activist Andrei Sakharov, talked the Soviet leader Nikita Khrushchev out of it. Sakharov was worried enough about what a 50-megaton bomb would do.
The largest nuclear weapon detonated up to that point had been a 15-megaton hydrogen bomb, set off as part of the 1954 Castle Bravo test conducted by the United States in the Pacific. There were serious questions about possible runaway chain reactions that such a huge blast might trigger, and about whether the radioactive fallout would affect the entire planet. (What “affect” meant wasn’t exactly clear, either.) There was a lot that was still unproven and theoretical about very large bombs, and the Castle Bravo test highlighted the inexact science of estimating a bomb’s power. For a start, it was not supposed to have been that large—the bomb surprised everyone and turned out to be more than twice as powerful as expected. In fact, a few scientists feared that the air itself might catch fire.
The gigantic Soviet bomb—known as “Tsar Bomba” in the West—was described like this by the historian John Lewis Gaddis in The Cold War: “[It was] the single largest blast human beings had ever detonated—or have since—on the planet. The flash was visible 600 miles away. The fireball,” now quoting someone who saw it, “was powerful and arrogant like Jupiter. It seemed to suck the whole earth into it.”
Gaddis continues: “The mushroom cloud rose 40 miles into the stratosphere. The island over which the explosion took place was literally leveled, not only of snow but also of rocks, so that it looked to one observer like an immense skating rink. . . . One estimate calculated, on the basis of this test, that if the originally requested 100 megaton bomb had been used instead, the resulting firestorm would have engulfed an area the size of the state of Maryland.” It also would likely have killed the crew of any aircraft that dropped the bomb.*