More than any other chlorine-containing pesticide, the DDT molecule illustrates the conflict between potential benefit and hazard. DDT, a derivative of 1,1-diphenylethane, takes its name as an abbreviation of dichloro-diphenyl-trichloroethane.
DDT was first prepared in 1874. That it was a potent insecticide was not realized until 1942, just in time for its use in World War II as a delousing powder to stop the spread of typhus and to kill the larvae of disease-carrying mosquitoes. “Bug bombs,” made from aerosol cans filled with DDT, were used extensively by the U.S. military in the South Pacific. These delivered a double blow to the environment, releasing large amounts of CFCs along with clouds of DDT.
Even before 1970, by which time three million tons of DDT had been manufactured and used, concerns about its effect on the environment and the development of insect resistance to it had surfaced. The effect of DDT on wildlife, particularly birds of prey such as eagles, falcons, and hawks that are at the top of food chains, is attributed not directly to DDT but instead to its main breakdown product. Both DDT and the breakdown product are fat-soluble compounds that accumulate in animal tissues. In birds, however, this breakdown product inhibits the enzyme that supplies calcium to their eggshells. Thus birds exposed to DDT lay eggs with very fragile shells that often break before hatching. Starting in the late 1940s, a steep decline in the population of eagles, hawks, and falcons was noted. Major disturbances to the balance between useful and harmful insects, outlined by Rachel Carson in her 1962 book Silent Spring, were traced to increasingly heavy use of DDT.
During the Vietnam War, from 1962 to 1970, millions of gallons of Agent Orange—a mixture of chlorine-containing herbicides 2,4-D and 2,4,5-T—were sprayed over areas of Southeast Asia to destroy guerrilla-concealing foliage.
Although these two compounds are not particularly toxic, 2,4,5-T contains traces of a side product that has been implicated in the wave of birth defects, cancers, skin diseases, immune system deficiencies, and other serious health problems that affect Vietnam to this day. The compound responsible has the chemical name 2,3,7,8-tetrachlorodibenzodioxin—now commonly known as dioxin, though the word actually refers to a class of organic compounds that do not necessarily share the harmful properties of 2,3,7,8-tetrachlorodibenzodioxin.
2,3,7,8-tetrachlorodibenzodioxin, or dioxin
Dioxin is considered the most lethal compound made by man, although it is still a million times less deadly than nature’s most toxic compound, botulinum toxin A. In 1976 an industrial explosion in Seveso, Italy, released a quantity of dioxin, with devastating results—chloracne, birth defects, cancer—for local people and animals. Widespread media reporting of the event afterward firmly established all compounds referred to as dioxins as villains in the public mind.
Just as unexpected human health problems accompanied the use of a defoliant herbicide, so too did they appear with another chlorinated molecule, hexachlorophene, an extremely effective germicide used extensively in the 1950s and 1960s in soaps, shampoos, aftershave lotions, deodorants, mouthwashes, and similar products.
Hexachlorophene
Hexachlorophene was also routinely used on infants and added to diapers, talcum powders, and other baby toiletries. But in 1972 tests showed that its use led to brain and nervous system damage in laboratory animals. Hexachlorophene was subsequently banned from over-the-counter preparations and baby products, but because it is so effective against certain bacteria, it still has a limited use, despite its toxicity, in prescription acne medications and for surgical scrub preparations.
MOLECULES THAT PUT YOU TO SLEEP
Not all chlorocarbon molecules have been disastrous for human health. Beyond the antiseptic properties of hexachlorophene, one small chlorine-containing molecule proved to be a boon for medicine. Until the mid-1800s, surgery was performed without anesthesia—but sometimes with the administration of copious amounts of alcohol, in the belief that this would numb the agony. Some surgeons supposedly also imbibed in order to fortify themselves before inflicting such pain. Then in October 1846 a Boston dentist, William Morton, successfully demonstrated the use of ether as a way to induce narcosis—a temporary unconsciousness—for surgical procedures. Word of ether’s ability to allow painless surgery spread rapidly, and other compounds were soon being investigated for anesthetic properties.
James Young Simpson, a Scottish physician and professor of medicine and midwifery at the University of Edinburgh Medical School, developed a unique way of testing compounds as possible anesthetics. He would allegedly ask his dinner guests to join him in inhaling various substances. Chloroform (CHCl3), first synthesized in 1831, evidently passed the test. Simpson came to on the dining-room floor after the experiment with this compound, surrounded by his still comatose visitors. Simpson lost no time in employing chloroform on his patients.
The use of this chlorocarbon compound as an anesthetic had a number of advantages over ether: chloroform worked faster and smelled better, and less of it was required. As well, recovery from a procedure in which chloroform had been administered was faster and less unpleasant than from surgery using ether. The extreme flammability of ether was also a problem. It formed an explosive mixture with oxygen, and the smallest spark during a surgical procedure, even from metal instruments clanking together, could cause ignition.
Chloroform anesthesia was readily accepted for surgical cases. Even though some patients died, the associated risks were considered small. As surgery was often a last resort and as patients sometimes died from shock during surgery without anesthetics anyway, the death rate was deemed acceptable. Because surgical procedures were performed rapidly—a practice that had been essential without anesthesia—patients were not exposed to chloroform for any great length of time. It has been estimated that during the American Civil War almost seven thousand battlefield surgeries were performed under chloroform, with fewer than forty deaths due to the use of the anesthetic.
Surgical anesthesia was universally recognized as a great advance, but its use in childbirth was controversial. The reservations were partly medical; some physicians rightly expressed concerns about the effect of chloroform or ether on the health of an unborn child, citing observations of reduced uterine contractions and lowered rates of infant respiration with a delivery under anesthesia. But the issue was about more than just infant safety and maternal well-being. Moral and religious views upheld the belief that the pain of labor was necessary and righteous. In the Book of Genesis women, as Eve’s descendants, are condemned to suffer during childbirth as punishment for her disobedience in Eden: “In sorrow thou shalt bring forth children.” According to strict interpretation of this biblical passage, any attempt to alleviate labor pain was contrary to the will of God. A more extreme view equated the travails of childbirth with atonement for sin—presumably the sin of sexual intercourse, the only means of conceiving a child in the mid-nineteenth century.
But in 1853 in Britain, Queen Victoria delivered her eighth child, Prince Leopold, with the assistance of chloroform. Her decision to use this anesthetic again at her ninth and final confinement—that of Princess Beatrice in 1857—hastened the acceptance of the practice, despite criticism leveled against her physicians in The Lancet, the respected British medical journal. Chloroform became the anesthetic of choice for childbirth in Britain and much of Europe; ether remained more popular in North America.
In the early part of the twentieth century a different method of pain control in childbirth gained rapid acceptance in Germany and quickly spread to other parts of Europe. Twilight Sleep, as it was known, consisted of administration of scopolamine and morphine, compounds that were discussed in Chapters 12 and 13. A very small amount of morphine was administered at the beginning of labor. It reduced pain, although not completely, especially if the labor was long or difficult. Scopolamine induced sleep and, more important for the doctors endorsing this combination of drugs, ensured that a woman had no memory of her delivery. Twilight Sleep was seen as the ideal solution for the pain of childbirth, so much so that a public campaign promoting its use began in the United States in 1914. The National Twilight Sleep Association published booklets and arranged lectures extolling the virtues of this new approach.
Serious misgivings expressed by members of the medical community were labeled as excuses for callous and unfeeling doctors to retain control over their patients. Twilight Sleep became a political issue, part of the larger movement that eventually gained women the right to vote. What seems so bizarre about this campaign now is that women believed the claims that Twilight Sleep removed the agony of childbirth, allowing the mother to awaken refreshed and ready to welcome her new baby. In reality women suffered the same pain, behaving as if no drugs had been administered, but the scopolamine-induced amnesia blocked any memory of the ordeal. Twilight Sleep provided a false picture of a tranquil and trouble-free maternity.
Like the other chlorocarbons in this chapter, chloroform—for all its blessings to surgical patients and the medical profession—also turned out to have a dark side. It is now known to cause liver and kidney damage, and high levels of exposure increase the risk of cancer. It can damage the cornea of the eye, cause skin to crack, and result in fatigue, nausea, and irregular heartbeat, along with its anesthetic and narcotic actions. When exposed to high temperatures, air, or light, chloroform forms chlorine, carbon monoxide, phosgene, and/or hydrogen chloride, all of which are toxic or corrosive. Nowadays working with chloroform requires protective clothing and equipment, a far cry from the splash-happy days of the original anesthetic providers. But even if its negative properties had been recognized more than a century ago, chloroform would still have been considered a godsend rather than a villain by the hundreds of thousands who thankfully inhaled its sweet-smelling vapors before surgery.
There is no doubt that many chlorocarbons deserve the role of villain, although perhaps that label would be better applied to those who have knowingly disposed of PCBs in rivers, argued against the banning of CFCs even after their effects on the ozone layer were demonstrated, indiscriminately applied pesticides (both legal and illegal) to land and water, and put profit ahead of safety in factories and laboratories around the world.
We now make hundreds of chlorine-containing organic compounds that are not poisonous, do not destroy the ozone layer, are not harmful to the environment, are not carcinogenic, and have never been used in gas warfare. These find a use in our homes and industries, our schools and hospitals, and our cars and boats and planes. They garner no publicity and do no harm, but they cannot be described as chemicals that changed the world.
The irony of chlorocarbons is that those that have done the most harm or have the potential to do the most harm seem also to have been the very ones responsible for some of the most beneficial advances in our society. Anesthetics were essential to the development of surgery as a highly skilled branch of medicine. The development of refrigerant molecules for use in ships, trains, and trucks opened new trade opportunities; growth and prosperity followed in undeveloped parts of the world. Food storage is now safe and convenient with home refrigeration. We take the comfort of air-conditioning for granted, and we assume our drinking water is safe and that our electrical transformers will not burst into flames. Insect-borne diseases have been eliminated or greatly reduced in many countries. The positive impact of these compounds cannot be discounted.
17. MOLECULES VERSUS MALARIA
THE WORD MALARIA means “bad air.” It comes from the Italian words mal aria, because for many centuries this illness was thought to result from poisonous mists and evil vapors drifting off low-lying marshes. The disease, caused by a microscopic parasite, may be the greatest killer of humanity for all time. Even now there are by conservative estimates 300 million to 500 million cases a year worldwide, with two to three million deaths annually, mainly of children in Africa. By comparison, the 1995 Ebola virus outbreak in Zaire claimed 250 lives in six months; more than twenty times that number of Africans die of malaria each day. Malaria is transmitted far more rapidly than AIDS. Calculations estimate that HIV-positive patients infect between two and ten others; each infected malaria patient can transmit the disease to hundreds.
There are four different species of the malaria parasite (genus Plasmodium) that infect humans: P. vivax, P. falciparum, P. malariae, and P. ovale. All four cause the typical symptoms of malaria—intense fever, chills, terrible headache, muscle pains—that can recur even years later. The most lethal of these four is falciparum malaria. The other forms are sometimes referred to as “benign” malarias, although the toll they take on the overall health and productivity of a society is anything but benign. Malaria fever is usually periodic, spiking every two or three days. With the deadly falciparum form this episodic fever is rare, and as the disease progresses, the infected patient becomes jaundiced, lethargic, and confused before lapsing into a coma and dying.
Malaria is transmitted from one human to another through the bite of the anopheles mosquito. A female mosquito requires a meal of blood before laying her eggs. If the blood she obtains comes from a human infected with malaria, the parasite is able to continue its life cycle in the mosquito gut and be passed on when another human supplies the next meal. It then develops in the liver of the new victim; a week or so later it invades the bloodstream and enters the red blood corpuscles, now available to another bloodsucking anopheles.
We now consider malaria to be a tropical or semitropical disease, but until very recently it was also widespread in temperate regions. References to a fever—most probably malaria—occur in the earliest written records of China, India, and Egypt from thousands of years ago. The English name for the disease was “the ague.” It was very common in the low-lying coastal regions of England and the Netherlands—areas with extensive marshlands and the slow-moving or stagnant waters ideal for the mosquito to breed. The disease also occurred in even more northern communities: in Scandinavia, the northern United States, and Canada. Malaria was known as far north as the areas of Sweden and Finland near the Gulf of Bothnia, very close to the Arctic Circle. It was endemic in many countries bordering the Mediterranean Sea and the Black Sea.
Wherever the anopheles mosquito thrived, so did malaria. In Rome, notorious for its deadly “swamp fever,” each time a papal conclave was held, a number of the attending cardinals would die from the disease. In Crete and the Peloponnesus peninsula of mainland Greece, and other parts of the world with marked wet and dry seasons, people would move their animals to the high hill country during the summer months. This may have been as much to escape malaria from the coastal marshes as to find summer pastures.
Malaria struck the rich and famous as well as the poor. Alexander the Great supposedly died of malaria, as did the African explorer David Livingstone. Armies were particularly vulnerable to malaria epidemics; sleeping in tents, makeshift shelters, or out in the open gave night-feeding mosquitoes ample opportunity to bite. Over half the troops in the American Civil War suffered from annual bouts of malaria. Can we possibly add malaria to the woes suffered by Napoleon’s troops—at least in the late summer and fall of 1812, as they began their great push to Moscow?
Malaria remained a worldwide problem well into the twentieth century. In the United States in 1914 there were more than half a million cases of malaria. In 1945 nearly two billion people in the world were living in malarial areas, and in some countries 10 percent of the population was infected. In these places malaria-related absenteeism in the workforce could be as high as 35 percent and up to 50 percent for schoolchildren.
QUININE: NATURE’S ANTIDOTE
With statistics like these it is little wonder that for centuries a number of different methods have been used to try to control the disease. They have involved three quite different molecules, all of which have interesting and even surprising connections to many of the molecules mentioned in earlier chapters. The first of these molecules is quinine.
High in the Andes, between three thousand and nine thousand feet above sea level, there grows a tree whose bark contains an alkaloid molecule, without which the world would be a very different place today. There are about forty species of this tree, all of which are members of the Cinchona genus. They are indigenous to the eastern slopes of the Andes, from Colombia south to Bolivia. The special properties of the bark were long known to the local inhabitants, who surely passed on the knowledge that a tea brewed from this part of the tree was an effective fever cure.
Many stories tell how early European explorers in the area found out about the antimalarial effect of cinchona bark. In one a Spanish soldier suffering a malarial episode drank water from a pond surrounded by cinchona trees, and his fever miraculously disappeared. Another account involves the countess of Chinchón, Doña Francisca Henriques de Rivera, whose husband, the count of Chinchón, was the Spanish viceroy to Peru from 1629 to 1639. In the early 1630s Doña Francisca became very ill from malaria. Traditional European remedies were ineffectual, and her physician turned to a local cure, the cinchona tree. The species was named (although misspelled) after the countess, who survived thanks to the quinine present in its bark.
These stories have been used as evidence that malaria was present in the New World before the arrival of Europeans. But the fact that the Indians knew that the kina tree—a Peruvian word, which in Spanish became quina—cured a fever does not prove that malaria was indigenous to the Americas. Columbus arrived on the shores of the New World well over a century before Doña Francisca took the quinine cure, more than enough time for malarial infection to find its way from early explorers into local anopheles mosquitoes and spread to other inhabitants of the Americas. There is no evidence that the fevers treated by quina bark in the centuries before the conquistadors arrived were malarial. It is now generally accepted among medical historians and anthropologists that the disease was brought from Africa and Europe to the New World. Both Europeans and African slaves would have been a source of infection. By the mid-sixteenth century the slave trade to the Americas from West Africa, where malaria was rife, was already well established. In the 1630s, when the countess of Chinchón contracted malaria in Peru, generations of West Africans and Europeans harboring malarial parasites had already established an enormous reservoir of infection awaiting distribution throughout the New World.
Penny le Couteur & Jay Burreson Page 29