Prosper remembered the outbreak in his own way, mourning the same losses Estelle mourned and some of a different sort. He showed us a treasured book, like a family bible—except it was a botanical field guide—on the endpapers of which he had written a list of names: Apollo, Cassandra, Afrodita, Ulises, Orfeo, and almost twenty others. They were gorillas, an entire group that he had known well, that he had tracked daily and observed lovingly at Lossi. Cassandra was his favorite, Prosper said. Apollo was the silverback. “Sont tous disparus en deux-mille trois,” he said. All of them, gone in the 2003 outbreak. In fact, though, they hadn’t entirely disparus: He and other trackers had followed the group’s final trail and found six gorilla carcasses along the way. He didn’t say which six. Cassandra, dead with others in a fly-blown pile? It was very hard, he said. He had lost his gorilla family, and also members of his human family.
For a long time Prosper stood holding the book, opened for us to see those names. He comprehended emotionally what the scientists who study zoonoses know from their careful observations, their models, their data. People and gorillas, horses and duikers and pigs, monkeys and chimps and bats and viruses: We’re all in this together.
III
EVERYTHING COMES FROM SOMEWHERE
23
Ronald Ross came west from India, in 1874, at age seventeen, to study medicine at St. Bartholomew’s Hospital in London. He came to the study of malaria somewhat later.
Ross was a true son of the empire. His father, General Campbell Ross, a Scottish officer with roots in the Highlands, had served in the British Indian Army through the Sepoy Rebellion and fought in fierce battles against the hill tribes. Ronald had been “home” to England before, having endured a boarding school near Southampton. He fancied the idea of becoming a poet, or a painter, or maybe a mathematician; but he was the eldest of ten children, with all attendant pressures, and his father had decided he should enter the Indian Medical Service (IMS). After a lackluster five years at St. Bartholomew’s, Ross flunked the IMS qualifying exam, an inauspicious start for an eventual Nobel laureate in medicine. The two facts from his youth that do seem to have augured well are that he won a schoolboy prize for mathematics and that, during his medical training, he diagnosed a woman as suffering from malaria. It was an unusual diagnosis, malaria being virtually unknown in England, even amid the Essex marshes where this woman lived. History doesn’t record whether Ross’s diagnosis was right, because he scared her with talk of the deadly disease and she disappeared, presumably back into lowland Essex. Anyway, Ross tried the IMS exam again after a year, squeaked through, and was posted to duty in Madras. That’s where he started noticing mosquitoes. They annoyed him because they were so abundant in his bungalow.
Ross didn’t bloom early as a medical detective. He dabbled and dawdled for years, distracted with the enthusiasms of the polymath. He wrote poetry, plays, music, bad novels, and what he hoped were groundbreaking mathematical equations. His medical duties at the Madras hospital, which involved treating malarial soldiers with quinine, among other tasks, demanded only about two hours daily, which left him plenty of time for extracurricular noodling. But eventually the extracurriculars included wondering about malaria. What caused it—miasmal vapors, as the traditional view held, or some sort of infectious bug? If a bug, how was that bug transmitted? How could the disease be controlled?
After seven years of unexceptional service he returned to England on furlough, did a course in public health, learned to use a microscope, found a wife, and took her back to India. This time his post was a small hospital in Bangalore. He started looking through his microscope at blood smears from feverish soldiers. He lived an intellectually isolated life, far from scientific societies and fellow researchers, but in 1892 he learned belatedly that a French doctor and microscopist named Alphonse Laveran, working in Algeria and then Rome, had discovered tiny parasitic creatures (now known as protists) in the blood of malaria patients. Those parasites, Laveran argued, caused the disease. During another visit to London, with help from an eminent mentor there, Ross himself saw the “Laveran bodies” in malarial blood and was converted to Laveran’s idea, so far as it went.
Laveran had detected the important truth that malaria is caused by microbes, not by bad air. But that still left unexplained the wider matters of how these microbes reproduced in a human body, and how they passed from one host to another. Were they carried and ingested in water, like the germ causing cholera? Or might they be transmitted in the bite of an insect?
Ronald Ross’s eventual discovery of the mosquito-mediated life cycle of malarial parasites, for which he won his Nobel Prize in 1902, is famous in the annals of disease research and I won’t retell it here. It’s a complicated story, both because the life cycle of the parasites is so amazingly complex and because Ross, himself a complicated man, had so many influences, competitors, enemies, wrong ideas as well as right ones, and distracting disgruntlements. Two salient points are enough to suggest the connections of that story to our subject, zoonoses. First, Ross delineated the life history of malarial parasites not as he found them infecting humans but as he found them infecting birds. Bird malaria is distinct from human malaria but it served as his great analogy. Second, he came to see the disease as a subject for applied mathematics.
24
Numbers can be an important aspect of understanding infectious disease. Take measles. At first glance, it might seem nonmathematical. It’s caused by a paramyxovirus and shows itself as a respiratory infection, usually accompanied by a rash. It comes and it goes. But epidemiologists have recognized that, with measles virus, as with other pathogens, there’s a critical minimum size of the host population, below which it can’t persist indefinitely as an endemic, circulating infection. This is known as the critical community size (CCS), an important parameter in disease dynamics. The critical community size for measles seems to be somewhere around five hundred thousand people. That number reflects characteristics specific to the disease, such as the transmission efficiency of the virus, its virulence (as measured by the case fatality rate), and the fact that one-time exposure confers lifelong immunity. Any isolated community of less than a half million people may be struck by measles occasionally, but in a relatively short time the virus will die out. Why? Because it has consumed its opportunities among susceptible hosts. The adults and older children in the population are nearly all immune, having been previously exposed, and the number of babies born each year is insufficient to allow the virus a permanent circulating presence. When the population exceeds five hundred thousand, on the other hand, there will be a sufficient and continuing supply of vulnerable newborns.
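The fadeout logic behind critical community size is easy to watch in a toy calculation. The sketch below is a minimal, illustrative simulation, not any epidemiologist's published model: susceptibles catch the infection from infectious neighbors, the infectious recover into lifelong immunity, and newborns trickle into the susceptible pool. Every parameter in it (the reproductive number of 12, the eight-day infectious period, the 3 percent birth rate, the starting conditions) is an assumption chosen only to show the qualitative effect.

    import numpy as np

    def days_until_fadeout(pop, years=30, r0=12.0, infectious_days=8,
                           annual_birth_rate=0.03, seed=1):
        # Toy daily-step stochastic model: S susceptible, I infectious,
        # everyone else immune; deaths are ignored. All parameter values
        # are illustrative assumptions, not measured measles figures.
        rng = np.random.default_rng(seed)
        gamma = 1.0 / infectious_days          # daily chance an infective recovers
        beta = r0 * gamma                      # transmission rate giving R0 ~ r0
        s = int(pop / r0)                      # susceptibles near the endemic level
        i = max(1, int(pop * annual_birth_rate * infectious_days / 365))
        birth_debt = 0.0
        for day in range(years * 365):
            if i == 0:
                return day                     # fadeout: the transmission chain broke
            p_inf = min(1.0, beta * i / pop)   # per-susceptible daily infection risk
            new_cases = rng.binomial(s, p_inf)
            recovered = rng.binomial(i, gamma)
            birth_debt += pop * annual_birth_rate / 365.0
            births = int(birth_debt)           # whole newborns join the susceptibles
            birth_debt -= births
            s += births - new_cases
            i += new_cases - recovered
        return None                            # still circulating when the run ends

    for n in (50_000, 200_000, 1_000_000):
        print(n, days_until_fadeout(n))

In typical runs the infection sputters out within a few years in the town-sized populations but persists through the entire simulated period in the city of a million: the same brute fact, roughly, that critical community size describes.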
Another crucial aspect of measles is that the virus is not zoonotic. If it were—if it circulated also in animals living near or among human communities—then the question of critical community size would be moot. There wouldn’t be any necessary minimum size of the human population, because the virus could always remain present, nearby, in that other source. But bear in mind that measles, though it doesn’t circulate in nonhuman animal populations, is closely related to viruses that do. Measles belongs to the genus Morbillivirus, which includes canine distemper and rinderpest; its family, Paramyxoviridae, also encompasses Hendra and Nipah. Although measles doesn’t now pass between humans and other animals, its evolutionary lineage speaks of such passage sometime in the past.
Whooping cough, to take another example, has a critical community size that differs slightly from the measles number because it’s a different disease, caused by a microbe with different characteristics: different transmission efficiency, different virulence, different period of infectivity, et cetera. For whooping cough, the CCS seems to be more like two hundred thousand people. Such considerations have become grist for a lot of fancy ecological mathematics.
Daniel Bernoulli, a Dutch-born mathematician from a family of mathematicians, was arguably the first person to apply mathematical analysis to disease dynamics, long before the germ theories of disease (there was a gaggle, not just one) became widely accepted. In 1760, while holding a professorship at the University of Basel in Switzerland, Bernoulli produced a paper on smallpox, exploring the costs versus the benefits of universal immunization against that disease. His career was long and eclectic, encompassing mathematical work on a wide range of topics in physics, astronomy, and political economy, from the movement of fluids and the oscillation of strings to the measurement of risk and ideas about insurance. The smallpox study seems almost anomalous amid Bernoulli’s other interests, except that it also entailed the notion of calculating risk. What he showed was that inoculating all citizens with small doses of smallpox material (it wasn’t known to be a virus then, just some sort of infectious stuff) had both risks and benefits, but that the benefits outweighed the risks. On the risk side, there was the fact that artificial inoculation sometimes—though rarely—led to a fatal case of the disease. More usually, inoculation led to immunity. That was an individual benefit from a single action. To gauge the collective benefits from collective action, Bernoulli figured the number of lives that would be saved annually if smallpox were entirely eradicated. His equations revealed that the net result of mass inoculation would be three years and two months of increased lifespan for the average person.
Life expectancy at birth wasn’t high in the late eighteenth century, and those three years and two months represented a sizable increment. But because the real effects of smallpox are not averaged between the people who catch it and the people who don’t, Bernoulli also expressed his results in a more stark and personal way. Among a cohort of 1,300 newborns, he projected, using life-table statistics for all causes of death as available to him at the time, 644 of those babies would survive at least to age twenty-five, if they lived in a society without smallpox. But if smallpox were endemic, only 565 of the same group would reach a twenty-fifth birthday. Health officials and ordinary citizens, imagining themselves among the seventy-nine preventable fatalities, could appreciate the force of Bernoulli’s numerical argument.
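Bernoulli's accounting compresses into a few lines of modern notation. The sketch below is a reconstruction, not his original presentation; the one-in-eight annual chance of catching smallpox and the one-in-eight chance of dying of it are the assumptions commonly attributed to his 1760 analysis, not figures from the passage above.

    import math

    # Bernoulli's correction in modern dress (a reconstruction, not his
    # original notation). lam is the assumed constant annual hazard of
    # catching smallpox and c the assumed case fatality; the 1/8 values
    # are those commonly attributed to his 1760 analysis.
    lam, c = 1.0 / 8.0, 1.0 / 8.0

    def survivors_without_smallpox(observed_survivors, age):
        # Probability of having escaped death by smallpox up to this age:
        # 1 - c * (1 - exp(-lam * age)). Dividing the observed survivors
        # by it estimates the survivors in a smallpox-free world.
        return observed_survivors / (1.0 - c * (1.0 - math.exp(-lam * age)))

    # From the text: 565 of 1,300 newborns reach age twenty-five with
    # smallpox endemic.
    print(round(survivors_without_smallpox(565, 25)))   # about 642

Run it and the answer lands within a couple of souls of Bernoulli's published 644; the small remainder presumably hides in rounding and the details of his life table.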
Bernoulli’s work, applying mathematics to understand disease, pioneered an approach but didn’t create an immediate trend. Time passed. Almost a century later, the physician John Snow used statistical charts as well as maps to demonstrate which water sources (notably, the infamous Broad Street pump) were infecting the most people during London’s cholera outbreak of 1854. Snow, like Bernoulli, lacked the advantage of knowing what sort of substance or creature (in this case it was Vibrio cholerae, a bacterium) caused the disease he was trying to comprehend and control. His results were remarkable anyway.
Then, in 1906, after Louis Pasteur and Robert Koch and Joseph Lister and others had persuasively established the involvement of microbes in infectious disease, an English doctor named W. H. Hamer made some interesting points about “smouldering” epidemics in a series of lectures to the Royal College of Physicians in London.
Hamer was especially interested in why diseases such as influenza, diphtheria, and measles seem to mount into major outbreaks in a cyclical pattern—rising to a high case count, fading away, rising again after a certain interval of time. What seemed curious was that the interval between outbreaks remained, for a given disease, so constant. The cycle that Hamer plotted for measles in the city of London (population at that time: 5 million) was about eighteen months. Every year and a half came a big measles wave. The logic of such cycles, Hamer suspected, was that an outbreak declined whenever there weren’t enough susceptible (nonimmune) people left in the population to fuel it, and that another outbreak began as soon as new births had supplied a sufficient number of new victims. Furthermore, it wasn’t the sheer number of susceptible individuals that was crucial, but the density of susceptibles multiplied by the density of infectious people. In other words, contact between those two groups was what mattered. Never mind the recovered and immune members of the population; they just represented padding and interference so far as disease propagation was concerned. Continuation of the outbreak depended on the likelihood of encounters between people who were infectious and people who could be infected. This idea became known as the “mass action principle.” It was all about math.
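Hamer's principle is simple enough to set in motion. Here is a minimal discrete-time sketch in the spirit of his argument, emphatically not his actual 1906 calculation: each generation's new cases are proportional to susceptibles times infectives, and births restock the pool. The transmission coefficient, the birth increment, and the starting values are all illustrative assumptions.

    # A toy mass-action model, in the spirit of Hamer's argument (not his
    # actual 1906 calculation). Each step is one two-week measles generation;
    # every numeric value below is an illustrative assumption.
    beta = 1.0 / 250_000               # transmission coefficient (assumed); cases
                                       # multiply whenever s exceeds 250,000
    births = 7_000                     # new susceptibles added per step (assumed)
    s, i = 300_000, 100                # starting susceptibles and infectives (assumed)

    for step in range(120):            # 120 steps, roughly four and a half years
        new_cases = min(beta * s * i, s)   # mass action: S times I, capped at S
        s = s - new_cases + births
        i = new_cases                  # infectives recover after one generation
        if step % 4 == 0:
            print(f"step {step:3d}   susceptibles {s:9.0f}   cases {i:9.0f}")

Run it and the rhythm appears: cases surge, the susceptible pool is spent, the outbreak collapses, births quietly rebuild the pool, and in time another surge begins. (In this crude version the waves sharpen unrealistically as they repeat, a known artifact of so simple a model, but the cycle itself is Hamer's point.)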
The same year, 1906, a Scottish physician named John Brownlee proposed an alternate view, contrary to Hamer’s. Brownlee worked as a clinician and hospital administrator in Glasgow. For a paper delivered to the Royal Society of Edinburgh, he plotted sharp up-and-down graphs of case numbers, week by week or month by month, from the empirical records of several disease outbreaks—plague in London (1665), measles in Glasgow (1808), cholera in London (1832), scarlet fever in Halifax (1880), influenza in London (1891), and others—and then matched them with smooth rollercoaster curves derived from a certain mathematical equation. The equation expressed Brownlee’s suppositions about what caused the outbreaks to rise and decline, and the good fits against empirical data proved (to him, anyway) that his suppositions were correct. Each epidemic had arisen, he argued, with “the acquisition by an organism of a high grade of infectivity,” a sudden increase of the pathogen’s catchiness or potency, which thereafter decreased again at a high rate. The epidemic’s decline, which was generally not quite as abrupt as its start, resulted from this “loss of infectivity” by the disease-causing organism. The plague bacterium had shot its wad. The measles virus had slowed or weakened. Influenza had turned tame. Malign power had deserted each of them like air going out of a balloon. Don’t waste your time worrying about the number or the density of susceptible people, Brownlee advised. It was “the condition of the germ,” not the character of the human population, that determined the course of the epidemic.
One problem with Brownlee’s nifty schema was that other scientists weren’t quite sure what he meant by “infectivity.” Was that synonymous with transmission efficiency, as measured by the number of transmissions per case? Or synonymous with virulence? Or a combination of both? Another problem was that, whatever he meant by infectivity, Brownlee was wrong to think that its inherent decline accounted for the endings of epidemics.
So said the great malaria man, Ronald Ross, in a 1916 paper presenting his own mathematical approach to epidemics. Ross by that time had received his Nobel Prize, and a knighthood, and had published a magnum opus, The Prevention of Malaria, which in fact dealt with understanding the disease in scientific and historical depth as well as preventing it. Ross recognized that, because of the complexity of the parasite and the tenaciousness of the vectors, malaria probably couldn’t be “extirpated once and forever”—at least not until civilization reached “a much higher state.” Malaria reduction, therefore, would need to be a permanent part of public health campaigns. Ross meanwhile had turned increasingly to his mathematical interests, which included a theory of diseases that was more general than his work on malaria, and a “theory of happenings” that was more general than his theory of diseases. By “happenings” he seems to have meant events of any sort that pass through a population, like gossip, or fear, or microbial infections, affecting individuals sequentially.
He began the 1916 paper by professing surprise that “so little mathematical work should have been done on the subject of epidemics,” and noted without false modesty (or any other kind) that he himself had been the first person to apply a priori mathematical thinking (that is, starting with invented equations, not real-world statistics) to epidemiology. He nodded politely to John Brownlee’s “excellent” work and then proceeded to dismiss it, rejecting Brownlee’s idea about loss of infectivity and offering instead his own theory, supported by his own mathematical analysis. Ross’s theory was that epidemics decline when, and because, the density of susceptible individuals in the population has fallen below a certain threshold. Look and see, he said, how nicely my differential equations fit the same sets of epidemic data that Dr. Brownlee adduced. Brownlee’s hypothetical “loss of infectivity” was unnecessary for explaining the precipitous decline of an epidemic, whether the disease was cholera or plague or influenza or something else. All that was necessary was the depletion of susceptibles to a critical point—and then, shazam, the case rate fell drastically and the worst was over.
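Ross's threshold argument fits in a single line of modern notation. The symbols below are today's conventions (formalized in the 1927 work of Kermack and McKendrick, which built directly on Ross), not the notation of his 1916 paper. With N people, S of them susceptible, I infectious, a transmission rate β, and a recovery rate γ:

    \frac{dI}{dt} = \beta \frac{S}{N} I - \gamma I = \gamma I \left( R_0 \frac{S}{N} - 1 \right), \qquad R_0 = \frac{\beta}{\gamma}

The number of infectives grows only while S/N exceeds 1/R_0. Deplete the susceptibles past that threshold and the case rate must fall, whatever the "condition of the germ."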
Ross’s a priori approach may have been perilous, at such an early stage of malaria studies, and his attitude a little arrogant, but he produced useful results. His insight about susceptibles has stood the test of time, coming down through the decades of theoretical work on infectious diseases to inform modern mathematical modeling. He was right about something else, too: the difficulty of extirpating malaria “once and forever.” Although the control measures he advocated were effective at reducing malaria in certain locales (Panama, Mauritius), in other places they failed to do much good (Sierra Leone, India) or the results were transitory. For all his honors, for all his mathematical skills, for all his combative ambition and obsessive hard work, Ronald Ross couldn’t conquer malaria, nor even provide a strategy by which such an absolute victory would eventually be won. He may have understood why: because it’s such an intricate disease, deeply entangled with human social and economic considerations as well as ecological ones, and therefore a problem more complicated than even differential calculus can express.
25
When I first wrote about zoonotic diseases, for National Geographic in 2007, I was given to understand that malaria was not one. No, I was told, you’ll want to leave it off your list. Malaria is a vector-borne disease, yes, in that insects carry it from one host to another. But vectors are not hosts; they belong to a different ecological category from, say, reservoirs; and they experience the presence of the pathogen in a different way. Transmission of malarial parasites from a mosquito to a human is not spillover. It’s something far more purposive and routine. Vectors seek hosts, because they need their resources (meaning, in most cases, their blood). Reservoirs do not seek spillover; it happens accidentally and it gains them nothing. Therefore malaria is not zoonotic, because the four kinds of malarial parasite that infect humans infect only humans. Monkeys have their own various kinds of malaria. Birds have their own. Human malaria is exclusively human. So I was told, and it seemed to be true at the time.