In a historic decision, Searle agreed to move forward with the manufacture of the first commercial oral contraceptive. Fortunately, the company did not overlook the troubling side effects from the Puerto Rico trials; taking them seriously, Searle scientists adjusted the formulation of Rock and Pincus’s synthetic progesterone compound to reduce breakthrough bleeding and other adverse symptoms. The result was a small white tablet, not terribly different in size and weight from an aspirin. Sanger was ecstatic. The impossible dream of a lifelong feminist had somehow become reality.
Searle anointed the pill with the trade name Enovid. The FDA approved Enovid as a contraceptive in May 1960, and five months later Searle started marketing Enovid to the public, seven years after Gregory Pincus had received his first check from Katharine McCormick and fourteen years after Russell Marker had set up his private progesterone lab in a Mexican pottery shed. At the age of eighty-five, Katharine Dexter McCormick celebrated the event by being one of the very first women in the United States to walk into a drugstore to have her birth control pill prescription filled.
Within two years of Enovid’s release, 1.2 million American women were on the Pill. By 1965, this number had risen to five million. The drug that no company wanted to touch turned out to be Searle’s best-selling product for more than a decade, far outstripping its sales of glucocorticoids. By the late 1960s, seven pharmaceutical companies were producing oral contraceptives and more than twelve million women were taking the Pill worldwide. Today, more than 150 million prescriptions for the Pill are written each year.
There are few medical inventions in history that transformed the basic fabric of society so quickly and so dramatically. Rock and Sanger both viewed the Pill primarily as a public health measure, to prevent physical deterioration from excessive pregnancies, and secondarily as a way to improve the financial stability of impoverished women who could not afford to raise additional children. Arrayed against them were social conservatives who argued that the Pill would encourage women to engage in society-destroying promiscuous sex. But the reality was quite different from what anyone imagined.
“Someone said once that not everyone with vocal cords is an opera singer. And not everyone with a womb needs to be a mother,” asserted Gloria Steinem. “When the Pill came along we were able to give birth—to ourselves.” Women could now pursue a career as a doctor, lawyer, or business executive on their own timetable. Average family size plummeted and soon became inversely proportional to family income, a clear indication that birth control was fully embraced by the educated and wealthy classes.
The Pill made it possible for women to control their fertility without depending on their partners, and in a manner that was disconnected from the sexual act itself. While the Pill was certainly not the first contraceptive intended to work this way—the sixth-century medical writings of Aetios of Amida advised women to avoid pregnancy by wearing cat testicles in a tube tied around the waist—the Pill was the first one that actually worked.
Lynne Luciano, a history professor at California State University with an interest in women’s issues, points out how the Pill changed society’s basic perception of sex. “In psychology journals, prior to 1970, frigidity was listed as a major problem for women. Today, frigidity has practically vanished from the literature. It’s been replaced by erectile dysfunction and premature ejaculation, which were never considered problems before.”
Not everything changed, however. Ever the idealist, John Rock had always maintained that oral contraception was compatible with the Catholic faith. The pope thought otherwise and explicitly banned the Pill in the encyclical “Humanae Vitae,” a policy statement authored by Pope Paul VI in 1968 to reaffirm the orthodox teachings of the Catholic Church. Yet when Rock was confronted with the Church’s opposition to his revolutionary drug, instead of terminating his involvement with oral contraception, he discovered that he was an idealist first and a Catholic second. After a lifetime of attending daily mass, he ceased going to church altogether. Despite the pope’s ban, millions of Catholic women around the world also chose to follow their own conscience and committed the sin of swallowing the little white tablet.
The Pill did not originate in a Big Pharma science lab or a sales team meeting. First, Swiss dairy farmers who wanted to make their cows pregnant faster made a peculiar anatomical discovery. Then, a veterinary professor published this finding, leading to the identification of progesterone as an anti-ovulation drug. An eccentric and solitary chemist figured out how to make progesterone simply because it was an interesting puzzle. Two septuagenarian feminists selected a discredited biologist to realize their dream of creating an oral contraceptive. A devout and hopelessly idealistic Catholic gynecologist agreed to run the world’s first human trials of the oral contraceptive. Together, the biologist and gynecologist dodged federal and state laws—and medical ethics—by holding trials in Puerto Rico and ignoring clear signs of adverse side effects. They only succeeded in convincing a pharmaceutical company terrified of Catholic boycotts to manufacture the drug after the company fortuitously noticed that women were spontaneously using one of their other drugs for the off-label purpose of contraception.
This, in a nutshell, is why it is so hard to develop new medicines. Imagine you want to replicate this process: “Can we develop a cure for baldness the same way we developed a birth control pill?” To become a successful drug hunter requires talent, moxie, persistence, luck—and even then, it might not be enough. And we should not overlook Big Pharma’s frustrating and unhelpful role in this process. Every single pharma company rejected Pincus and Sanger’s proposals when they solicited the companies for help developing the Pill. A previously hostile pharma company jumped in only after an independent team of drug hunters sweated and bled their way to an FDA-approvable clinical trial entirely on their own.
The modern drug development process is drastically unfair and completely unreasonable, and yet it still managed to significantly improve the lives of hundreds of millions of women. And this is the true nature of drug hunting.
12
Mystery Cures
Discovering Drugs Through Blind Luck
[Illustration: James Lind treating sailors with scurvy]
“A sick thought can devour the body’s flesh more than fever or consumption.”
—Guy de Maupassant, Le Horla et autres contes fantastiques
One of the most basic truths of drug hunting is the uncomfortable fact that the vast majority of important drugs were discovered without the foggiest idea of how the drug actually worked. It often takes decades before researchers decipher how a new drug fully operates on the body. In many cases, despite generations of investigation, we still do not fully comprehend how a particular medication works. For instance, as of 2016, gaseous surgical anesthetics (such as halothane), modafinil (a narcolepsy drug), and riluzole (an ALS drug) all remain pharmaceutical mysteries. For physicians, this lack of understanding can be somewhat unsettling. But for the drug hunter, it can be liberating.
Anyone with an alert mind stands a chance of identifying a potentially useful compound and converting it into a medication, even if they possess little knowledge of biological mechanisms. During the Age of Plants, of course, drug hunters possessed zero understanding of how medicines worked. Drug discovery was 100 percent trial and error. Until Ehrlich proposed his receptor theory in the early twentieth century, theories of how drugs worked ranged from the misguided (such as the proposition that drugs changed the shape of cells) to the ludicrous (such as the conviction that the cure for a given disease would come from a plant that physically resembled the diseased organ). Even so, sometimes even the most ignorant of beliefs can serve as the catalyst for a crucial discovery. Simply having the motivation to proceed—any motivation—can stir a drug hunter to continue down the rugged path of exploration. In fact, the very first scientific experiment on the curative effects of a drug was the result of a fallacious assumption.
Scurvy is a horrible affliction recognized since antiquity. In the fifth century BC, Hippocrates identified its symptoms of bleeding gums and body-wide hemorrhaging, followed by death. Scurvy was fairly uncommon in ancient times, however, because ocean voyages were rarely very long. But the disease exploded at the dawn of the fifteenth century, as Europeans began to tackle extended sea journeys to distant continents. In the midst of a sustained ocean cruise, vigorous and healthy sailors would suddenly collapse.
Some historians state that scurvy caused more deaths in the British fleet in the eighteenth century than French and Spanish arms combined. Richard Walter, the chaplain of Commodore George Anson’s ill-fated circumnavigation of the globe, wrote up an official account of the voyage. Anson set out from England on September 18, 1740, with six warships and 1,854 men. By the time the expedition returned home four years later, only 188 remained alive. Most had perished from scurvy, which Walter documented in his reports. He described ulcers, difficult respiration, rictus of the limbs, skin as black as ink, teeth falling out, and—perhaps most disquieting of all—a foul corruption of the gums that endowed the victim’s breath with an abominable odor.
Scurvy also seems to affect the nervous system by shutting down sensory inhibitors, making the victim extremely sensitive to taste, smell, and sounds. The fragrance of flowers on the shore can cause a victim to moan in agony, while the crack of gunfire can be enough to kill a man in the advanced throes of the disease. In addition, victims’ emotions often become unmanageable, so that they cry out at the slightest disappointment and yearn disconsolately for home.
In the eighteenth century, no one knew what caused scurvy, so no one had any idea how to prevent or treat it. The medical establishment’s best guess was that scurvy was a disease of putrefaction and thus best treated by acids, preparations believed to slow the rotting process, such as elixir of vitriol (sulfuric acid). It was not clear that the acid treatments were helpful, though, so eventually a Scottish physician decided to put the acid theory to the test.
James Lind was appointed ship’s surgeon of HMS Salisbury in the Channel Fleet in 1747. After the ship had been two months at sea, sailors began falling ill with scurvy. Lind took the opportunity to launch his experiment. His approach was sensible and straightforward: he applied a variety of different acids to his scurvy patients and evaluated the results. Lind divided twelve of the sick sailors into six groups of two, an extremely small sample size by modern standards. All his patients received the same diet, but each pair of sailors was treated with a different kind of acid. The first group received a quart of (lightly acidic) cider; the second group received twenty-five drops of elixir of vitriol (the most highly regarded remedy at the time); the third group received six spoonfuls of (lightly acidic) vinegar; the fourth group received two oranges and one lemon, because citrus fruits are acidic; and the fifth group received a spicy paste plus a drink of barley water (spices were another common treatment for scurvy because their effects were believed to be similar to acid). The sixth group, meanwhile, received half a pint of seawater; this placebo treatment made the final pair of sailors the very first control group in a clinical drug trial.
After six days Lind ran out of fruit, so he had to terminate his tests on group four. Yet, amazingly, one of the citrus-treated sailors was already fit for duty while the other had almost completely recovered. None of the other sailors had recovered at all except for the pair treated with cider, who showed a mild improvement. Today, of course, the interpretation of these results is obvious. We now know that scurvy is a disease caused by a dietary deficiency of vitamin C, a compound required for the synthesis of collagen. Collagen provides the strength, structure, and resiliency for our connective tissues, including our blood vessels, and without enough collagen our connective tissue breaks down and produces the symptoms of scurvy, including bleeding and the reopening of old wounds. Citrus fruits contain high levels of vitamin C, while apple cider contains small amounts of vitamin C; none of the other treatments that Lind employed contain any vitamin C. Since fruits and vegetables could not be stored on long sea voyages, eighteenth-century sailors subsisted on cured meats and dried grains—a diet that lacked vitamin C.
Vitamin C itself was not discovered until the 1930s, almost two centuries after Lind’s pioneering experiment. So when Lind published A Treatise of the Scurvy in 1753, sharing the results of his acid evaluation, his findings were largely ignored. Even though he had shown that citrus fruits and apple cider were effective treatments for scurvy, he had no idea why, and without the why most physicians still clung to their familiar (but useless) acid treatments. Over time, though, many officers and surgeons came to realize that Lind was correct and that citrus fruits were indeed an effective answer to scurvy. More and more ships began providing their sailors with citrus fruits and citrus drinks on long journeys, dramatically reducing the incidence of the gum-rotting disease. Finally, in 1795—four decades after Lind’s study—the British navy officially adopted lemons and limes as standard issue at sea. It took almost another decade before the British naval supply chain could provide adequate citrus to its ships all around the globe. Limes came to be most popular since they were abundant in the British West Indian colonies (unlike lemons), leading Americans to endow British sailors with the nickname “limeys.”
One reason it was so difficult to identify the active ingredient in citrus fruits that prevented scurvy was that scientists were not able to produce scurvy in animals. Eventually, the medical establishment came to believe that scurvy was a disease that afflicted only Homo sapiens. Since it was not possible to conduct scurvy experiments on animals, the only way to test the effect of different citrus fruit compounds was to use scurvy-ridden humans—but who would be willing to volunteer to endure the disgusting, painful disease for a medical experiment, especially considering that you might not even get treated with an effective compound? As a result, there was little progress in understanding how citrus fruits worked, until a stroke of good fortune befell two Norwegian scientists in 1907.
Axel Holst and Theodor Frølich were trying to induce beriberi in animals, a disease now known to be caused by a lack of vitamin B1. They fed guinea pigs a diet limited to grains and flour, hoping to produce beriberi. To their surprise, the guinea pigs developed scurvy instead. This was a wildly lucky turn of events, because virtually every species of mammal is able to synthesize its own vitamin C within its body and thus does not require the vitamin in its diet. Holst and Frølich had fortuitously stumbled upon one of the precious few species other than humans that do not produce vitamin C internally. The two scientists recognized that they had just come up with an animal model of scurvy. Several teams began trying to identify the active ingredient in citrus fruits that prevented scurvy, and in 1931 scientists finally identified L-hexuronic acid as the critical compound. It was later renamed ascorbic acid, from a- (“no”) and -scorbutus (“scurvy”). It took another twenty-five years before scientists identified ascorbic acid’s role in building collagen. Thus, it was more than two centuries from the time that James Lind identified an effective anti-scurvy drug until the medical establishment finally unraveled how the drug worked.
Perhaps the broadest and most frequently prescribed class of “mystery” drugs today is the psychoactives—medications for mental illness. All the way into the 1950s, not only was there no therapeutic drug for schizophrenia, depression, or bipolar disorder, but most members of the psychiatric establishment believed there could never be a drug to treat these disorders, since it was widely believed that mental illness was primarily due to unresolved childhood experiences. This was the central conviction of Sigmund Freud, whose theory of mental illness—known as psychoanalysis—swept through the United States in the early twentieth century. (Ironically, Freudianism was almost completely wiped out in Europe for the same exact reason it became so popular in America. The vast majority of the early psychoanalysts were Jews, as was Freud himself, and as the Nazis rose to power in Hitler’s Germany, these Jewish psychoanalysts fled Europe for the safety of American shores, moving the world capital of psychoanalysis from Vienna, Austria, to New York City. It was as if the Holy See of the Catholic Church moved from the Vatican to Manhattan.)
By 1940, psychoanalysts had taken over every position of power in American psychiatry, controlling university psychiatry departments and hospitals and completing a hostile takeover of the American Psychiatric Association. In addition, psychoanalysts drove a profound change in the basic nature of American psychiatry. Before the Freudians fled Nazi Europe, American psychiatry consisted almost entirely of “alienists”—psychiatrists who tended to the severely disabled mentally ill within mental institutions far away from population centers; the fact that mental asylums stood apart from good society gave rise to the moniker “alienist.” But the Freudians brought psychiatry into the American mainstream by insisting that everyone was “a little mentally ill” and that they could be fixed through relaxing therapy sessions in the comfortable offices of the psychoanalyst. Thus, the Freudians moved psychiatry out of remote, isolated institutions onto the couches of downtown offices and suburban homes.
Since psychoanalysts believed that patients could be cured only through “talk therapy”—exploring their childhood experiences through dreams, free association, and frank confession—they were convinced that no chemical could possibly bring about any positive change in a person suffering from mental illness. Consequently, there was absolutely no support for drug hunters questing for psychiatric medications. Through the 1950s, no Big Pharma company pursued any kind of program for mental illness drugs, no academic laboratory was looking for mental illness drugs, and very few mainstream hospitals were looking for evidence that a drug might improve the condition of their mental patients. Though a few non-Freudian alienists, still dealing with severely sick schizophrenic and suicidal patients in remote mental asylums, held out hope that there might one day be a pharmaceutical remedy, the entire medical profession took it for granted that there would never be a Salvarsan or insulin for mental illness. In such a hopelessly anti-drug environment, the only real hope for the development of a psychiatric drug was a false hypothesis and blind luck. But false hypotheses and blind luck have always been key ingredients of successful drug hunting.