by David Isaacs
That same year, nine children in Camden, New Jersey, died after being given smallpox vaccine also contaminated with tetanus.
At this time vaccines were often manufactured privately in the United States, with little supervision of the way the serum was produced. The next year, 1902, the United States passed the Biologics Control Act, which regulated vaccine and antitoxin production, required manufacturers to be licensed and mandated regular inspections of manufacturing facilities. Vaccine quality and safety improved, although, as we will see, the new procedures were not infallible.
The Lübeck tragedy
These American cautionary tales were not heeded sufficiently elsewhere. BCG vaccine to protect against tuberculosis was first made in France in 1921. In December 1929, the Lübeck General Hospital in northern Germany started to offer BCG vaccine to newborn babies, to reduce the risk of catching TB. Vials of BCG vaccine were shaken and some vaccine was added to a spoon of warm breast milk and given to the babies to drink.
Tragically, some vaccine vials were inadvertently contaminated with live Mycobacterium tuberculosis from the laboratory where the BCG was prepared. Over the next four months, 251 newborn babies were fed contaminated vaccine. While older children and adults usually develop no symptoms when infected with TB, newborns are more susceptible. Before they were three months old, babies delivered at the hospital were starting to die at a much higher rate than expected. Within a year, 90% of babies given the contaminated vaccine had developed clinical or X-ray evidence of TB, mostly in the lymph nodes, and 72 babies had died.
The Lübeck accident was global news, and a setback to acceptance of immunisation in general and BCG vaccine in particular. To this day, the United States and the Netherlands have never routinely immunised children with BCG.
The Bundaberg tragedy
Bundaberg is a small city in sugar-cane country four hours’ drive north of Brisbane, at the southern tip of the Great Barrier Reef. Bundaberg is famous for its rum, and infamous for an early vaccine disaster.
In the 1920s, diphtheria was a major problem in Bundaberg; during 1926 and 1927 more than 200 locals were infected with the disease and three died. An immunisation program was started in January 1928. On the 10th day of the program, 21 children aged 23 months to seven years were given the vaccine. Within hours, 18 of them became desperately ill. On 30 January 1928, the Bundaberg Daily News and Mail reported:
Begun on January 17, the inoculation went apparently well until Friday night when children who were given injections that day showed alarming symptoms. These developed rapidly to virulent and fatal blood poisoning resembling anthrax. Twelve children on Saturday died despite all efforts to save them and several more are in critical and dangerous states.
Consternation prevails . . . Last night the Mayor set in train with the Authorities concerned proposals to the Education Department to allow state school children to parade to-day as a tribute to their dead comrades.
It is an old but true saying ‘in the midst of life we are in death’. Today Bundaberg mourns the loss of twelve fine little children, whose bright young lives were nipped in the bud, as the result of some maleficent quality that developed or was hidden in the immunising serum injected into their arms on Friday for the purpose of assuring their safety against the dreaded diphtheria that is so threatening at this time.
The victims are:–
Thomas Robinson, 5½ years
William Robinson, 4 years
Mervyn Robinson, 23 months
Edward Baker, 5 years
Keith Baker, 3 years
George Baker, 6 years
Marsden Coates, 7 years
William Follitt, 2½ years
Mary Sheppard, 5 years
Monica Sheppard, 2½ years
Myrtle Brennan, 3½ years
Joan Peterson, 5½ years
One family lost all three of their sons, and one of the city councillors lost his two sons. The most likely cause was that the vaccine vial used for all the injections was accidentally contaminated with a toxin-producing strain of the Staphylococcus aureus bacterium (‘golden staph’). The tragedy was considered a public health disaster in Australia and it attracted attention globally. A royal commission was set up to investigate the causes.
One positive outcome of the tragedy was that it taught authorities important safety lessons. The contamination probably occurred because of repeated puncturing of a rubber bung on a multi-dose vial. Multi-dose vaccine vials were not used again for many years. However, they were used without mishap to deliver influenza vaccine during the 2009 influenza pandemic, because of the need to give many doses in a very short time.
The Bundaberg vaccine was not refrigerated and the Queensland January heat probably contributed to the growth of the contaminating bacteria. The tragedy led to recognition of the importance of keeping vaccines refrigerated to maintain the ‘cold chain’. Vaccines can still deteriorate if left unrefrigerated all day, and the problems can be compounded if the same syringe is used to give multiple doses, as happened in a remote village in South Sudan in 2017, when 15 children died from contaminated vaccine during a poorly conducted measles immunisation program. The program nevertheless ended a measles epidemic and saved hundreds of lives.
The Bundaberg vaccine also did not contain any preservatives or antiseptics to prevent contamination. Almost all vaccines now contain small amounts of disinfectants, such as phenol or phenoxyethanol, to prevent contamination with bacteria or fungi. The evidence suggests that these are not harmful and perform an essential role in maintaining sterility.
From time to time concerns have been raised about preservatives, particularly thiomersal (sometimes called thimerosal), which contains minuscule amounts of mercury and was once used commonly in vaccines. Although there is less mercury in any vaccine than we consume when eating fish, it is very difficult to prove scientifically that something is definitely not harmful. The United States Institute of Medicine reported in 2001 that there was insufficient evidence to prove or disprove whether thiomersal causes autism, attention deficit hyperactivity disorder, or speech or language delay. In 2004 the institute reported that the evidence now favours ‘rejection of a causal relationship between thimerosal-containing vaccines and autism’.
When, in the late 1990s, fears of possible harm from mercury poisoning were raised in Europe and North America, the response of most affluent countries was to try to reassure the public that thiomersal was safe, while instituting measures to remove it from most vaccines or reduce its concentration considerably. This sort of mixed message – ‘Thiomersal is safe, but we’ll remove it just to be sure’ – is at best confusing and at worst bad for public confidence in vaccine safety. Some experts who were convinced there was firm evidence that thiomersal was not harmful felt it would have been better to try to reassure the public while not changing the vaccines.
Vaccines also contain non-toxic stabilising additives such as gelatin, lactose, sucrose, sorbitol and mannitol. Furthermore, as we have discussed, since the 1920s very small amounts of aluminium salts (alum) have been added to many vaccines as an adjuvant. The amount of aluminium in vaccines is minimal; there is more in most normal foods and far more in antacids. Adjuvants are only used when vaccines do not stimulate an adequate immune response without them.
Concerns about alum arose when studies suggested a possible link between chronic aluminium exposure and Alzheimer’s disease. Though this link remains controversial and unproved, papers questioning the safety of aluminium in vaccines continue to be published. In 2004, Tom Jefferson, a British expert on vaccine safety and efficacy, reviewed five studies involving a total of over 12,000 infants and children, comparing DTP vaccines containing aluminium with those containing no aluminium. He found no evidence that aluminium salts in vaccines cause any serious or long-lasting adverse events.
Clearly, we need to be sure that any new adjuvants we use are safe. We can only know this by studying the effects of vaccines containing them. If the vaccines prove safe, the adjuvants must be too. There will always be theoretical concerns that adverse effects may not emerge for many years, but similar concerns apply to many products we use, including foods.
There is a saying ‘to pour oil on troubled waters’. Oil and water do not mix. In the body too, oily substances like lipids repel water, yet we would like to deliver vaccines to all parts of the body. Many of the new adjuvants we use have been designed to deliver the vaccines to remote sites. Liposome adjuvants are small round particles comprising a drop of liquid enclosed in one or more layers of lipid – effectively a water droplet enclosed in fat. Water-soluble proteins can be trapped in the liquid centre and protected by the fat layer until they reach the desired target site. The adjuvant AS01, used in a malaria vaccine given to children in clinical trials, is liposome-based.
Oil-in-water emulsions can also be used as adjuvants. An example is MF59. The adjuvanted influenza vaccine introduced in 2018 in Australia to give better protection to the elderly contains MF59.
The oil most commonly used in oil-in-water adjuvants is squalene oil, a naturally occurring oily compound. Shark-liver oil was the original source of squalene, but squalene oils can also be sourced from various plants.
Some new adjuvants are saponins, soapy substances that can make cell membranes leaky and allow proteins to penetrate. Saponins are also naturally occurring substances found mainly in plants; they can be sourced from sea cucumbers.
Studies of over 25,000 children exposed to vaccines containing one of four new adjuvants – AS01, AS02, AS03 and MF59 – showed they were significantly less likely to experience immediate serious side effects than children receiving conventional vaccines. Although the long-term adverse effects still need to be monitored, these results are encouraging.
The yellow fever vaccine
A cynical view might be that the Western world only gets interested in diseases of third-world countries when they affect war or trade or both. Yellow fever is a disease of developing countries in West Africa and Central and South America, but serious efforts to combat it only began when the United States wanted to protect workers building the Panama Canal around 1900. Sadly, the vaccine that eventually resulted has had a chequered history, both before and since its development.
Yellow fever virus infects and inflames the liver, causing hepatitis and jaundice. Jaundice is yellow discolouration of the skin caused by build-up in the bloodstream of a pigment, bilirubin, a breakdown product of red blood cells normally cleared by the liver. The name ‘yellow fever’ derives from this jaundice. The virus is transmitted to humans by mosquitoes that have bitten other infected humans or monkeys. It is another example of a human infection that originated in animals – in this case monkeys. Most Westerners only know about it because they have to get yellow fever vaccine to travel to Africa or South America, but it has been and remains a major killer in those areas. In 2013, the WHO estimated there were 84,000 to 170,000 severe cases of yellow fever and 29,000 to 60,000 deaths worldwide, of which 90% were in Africa.
Yellow fever started in Africa and spread to Central and South America with the slave trade. It became particularly important to the Western world because of the disruption it caused to trade, and was sometimes called ‘yellow jack’ after the yellow quarantine flag hoisted by affected ships.
Yellow fever even spread to the United States, causing an outbreak in New York in 1668, and one in Philadelphia in 1793, which killed 9% of the city’s population. George Washington’s government was based in Philadelphia at the time, and the president and his administration fled the city. Yellow fever also spread to Spain in the 19th century, causing outbreaks in Gibraltar and Barcelona that killed thousands.
In the late 1800s Western powers came up with the idea of cutting an artificial canal across the narrow Isthmus of Panama in Central America to provide a maritime trade waterway between the Atlantic and Pacific Oceans, allowing ships to bypass treacherous Cape Horn on the southern tip of South America. The French started building the Panama Canal in 1881 but abandoned the project. This was due partly to engineering problems – but mainly to worker deaths from tropical diseases, particularly yellow fever and malaria.
In 1902 the American government of Theodore Roosevelt passed the ‘Spooner Act’, named after Senator John Spooner, who drafted it, which paid the French a large sum to take over building the Panama Canal. Knowing the problems the French had faced, the United States military had appointed a commission in 1900, headed by army doctor and yellow fever specialist Walter Reed, to investigate the cause of yellow fever and how to prevent it.
A British doctor, Ronald Ross, working in India, had shown in 1897 that malaria was transmitted through mosquito bites. Reed’s team performed experiments that established that yellow fever is also transmitted by infected mosquitoes. These studies led the American military to literally pour oil on the troubled Panama waterways to prevent mosquitoes from breeding, and to apply larvicides if they did. This successfully prevented most of the yellow fever in Panama.
But the research was not without its failures. Although Reed was reasonably cautious, three United States army volunteers died after being injected with small doses of yellow fever virus.
One promising researcher on Reed’s team, 34-year-old Dr Jesse Lazear, wrote a letter to his wife in September 1900: ‘I rather think I am on the track of the real germ.’ Unfortunately the germ was on his track. Two and a half weeks later, Dr Lazear died from yellow fever. He had allowed an infected mosquito to bite him in an attempt to prove that mosquitoes transmitted the disease. The military covered up the fact that Dr Lazear had infected himself deliberately, and the truth was only discovered in 1947 from Dr Lazear’s own notebook.
The tragedies connected with yellow fever did not end in Panama. In the 1930s a live attenuated virus vaccine against yellow fever was developed. It soon became the third most widely used human vaccine after the smallpox and rabies vaccines. During World War II, fighting in yellow-fever-endemic zones such as North Africa led to a huge demand for yellow fever vaccine. Over 15 months from 1941 to 1942, the United States Army immunised 7 million troops. However, the vaccine had been stabilised with human serum, and one of the serum donors had had chronic hepatitis B infection. A little hepatitis B goes a long way: the entire supply of yellow fever vaccine was contaminated with hepatitis B virus. This led to many thousands of cases of hepatitis B, and an estimated 100 deaths. The Chicago Tribune sourly pointed out that 20 times as many soldiers had fallen victim to the vaccine in North Africa as had been wounded in battle.
Today the vaccine is safe and is given to travellers to countries where yellow fever is endemic, and the WHO is increasingly using it to prevent the disease. Between 2005 and 2016, over 105 million people were vaccinated across 14 of 34 endemic African countries.
The Cutter Incident
We heard in Chapter 4 about the race to develop a polio vaccine, and the fierce rivalry between two brilliant men, virologists Albert Sabin and Jonas Salk. Salk’s killed (inactivated) polio vaccine (IPV) was first off the blocks, ready to be tested in a large randomised controlled trial by 1954.
The American public, petrified of a disease that could infect thousands of children in a few days, was so desperate that compromises were being made even before the trial results were announced.
Jonas Salk had struggled to prepare enough safe vaccine in which all the poliovirus was definitely killed and which stimulated an immune response when injected. Salk could not make industrial quantities of IPV himself, so pharmaceutical companies were encouraged to apply for licences to make the vaccine, the understanding being that they were almost certain to get them. One of these companies, Eli Lilly, paid US$250,000 (equivalent to US$2 million now) to broadcast the announcement of the trial results on closed-circuit television to cinemas across the land. No conflict of interest there, then.
At 2.45pm, just three hours after the announcement of the trial findings, a licensing committee met to consider applications to manufacture IPV from five pharmaceutical companies: Eli Lilly, Parke-Davis, Pitman-Moore, Wyeth and Cutter Laboratories. The committee was looking for the first time at applications that nowadays would take a year to approve. They were a little pressed for time, because the United States Secretary of Health had arranged a press conference for 4.15pm that day, when she would publicly sign the licences. No pressure.
All five companies were granted licences and cardboard boxes of vaccines were shipped to health clinics the same evening. The speed of all this was breathtaking. But not all vaccines are created equal (with apologies to George Orwell for so mangling his wonderful line from Animal Farm).
All five companies had different methods of preparing the vaccine. There were safeguards: each company had to prove to the national Laboratory of Biologics Control that it had made 11 consecutive batches of IPV that contained no live poliovirus. However, these stringent requirements were relaxed as the intended vaccine launch approached, and companies were not required to document their failures along the way.
Bernice Eddy from the Laboratory of Biologics Control tested a preliminary IPV vaccine submitted by California-based family firm Cutter Laboratories. She inoculated the vaccine into the brains of 12 anaesthetised monkeys and into the muscles of another six. Half the batches Cutter submitted caused paralysis in the monkeys. There must have been live virus in the vaccine to give the monkeys polio. Although these batches of vaccine were never given to the public, Eddy feared Cutter Laboratories was using a flawed process. ‘There’s going to be a disaster. I know it,’ she confided to a friend. She told her boss, but he did not inform the licensing committee.
Two weeks after the launch of the new vaccine, disaster struck. A contemporary scientific report detailed: ‘On April 25, 1955, an infant with paralytic poliomyelitis was admitted to Michael Reese Hospital, Chicago, Illinois. The patient had been inoculated in the buttock . . . on April 16, and developed flaccid paralysis of both legs on April 24.’ Within days, the problem had been traced to the Cutter vaccine.