Pandora's Lab
First introduced into the United States in 2006, e-cigarettes are battery-powered vaporizers that deliver nicotine but don’t contain tobacco. The liquid that is vaporized also contains propylene glycol, glycerol, and assorted candy or dessert flavorings (such as Belgian waffle and chocolate). Only a letter and a hyphen away from one of the most destructive products ever invented (cigarettes), e-cigarettes have been condemned by virtually every scientist, physician, and government official responsible for the public’s health. And it’s not hard to understand why.
First, nicotine is highly addictive and potentially harmful, especially to a developing fetus. It can also cause headaches, nausea, vomiting, dizziness, nervousness, and a rapid heartbeat. Although some brands of e-cigarettes don’t contain nicotine, most do.
Next, Big Tobacco companies like Altria, Reynolds, and Imperial make e-cigarettes. Although company executives claim that e-cigarettes are an exit strategy for those trying to quit smoking, they haven’t exactly earned the trust of the American public. In 2012, e-cigarette makers spent more than $18 million on magazine and television ads. Unlike cigarette ads, which were banned in 1971, e-cigarette ads are unrestricted; companies are free to advertise them. As a result, e-cigarettes have become a $3.5-billion-a-year industry in the United States, with some predicting that sales will exceed those for conventional cigarettes by the mid-2020s.
Finally, reminiscent of the Joe Camel commercials, some advertisements for e-cigarettes are specifically designed to entice young people. When Julia Louis-Dreyfus was shown smoking an e-cigarette during the 2014 Golden Globe Awards, both Henry Waxman (D-CA) and Frank Pallone, Jr. (D-NJ), called the president of NBC to say that the actress was “sending the wrong message to kids about these products.” Protests by people like Waxman and Pallone have fallen on deaf ears; e-cigarettes have become enormously popular among the young. In 2013, about 250,000 minors who had never smoked a cigarette had tried an e-cigarette. In 2014, an estimated 1.6 million middle and high school students in the United States experimented with them, a dramatic increase from the previous year. Indeed, more than 10 percent of high school students in the United States have tried an e-cigarette. It seems like only a matter of time before this tidal wave of children using e-cigarettes will become a flood of adults smoking cigarettes—and dying from lung cancer as a result. E-cigarettes appear poised to add to the 480,000 deaths and $300 billion in direct health care expenditures and productivity losses caused each year by cigarette smoking.
For all of these reasons, the American Cancer Society, the American Lung Association, the Centers for Disease Control and Prevention (CDC), the World Health Organization, and the American Academy of Pediatrics all strongly oppose e-cigarettes. And, when I first approached this subject, I assumed that I would end up agreeing with them, wholeheartedly. There was, however, one problem: the data.
Alongside the dramatic rise in e-cigarette use during the past five years, cigarette smoking has declined to historically low levels, including among the young. For example, the CDC reported that although the use of e-cigarettes had tripled between 2013 and 2014, the incidence of cigarette smoking had declined dramatically. In 2005, 20.9 percent of adults smoked cigarettes; by 2014, only 16.8 percent did, a 20 percent drop. Indeed, in 2014, the number of Americans smoking cigarettes fell below 40 million for the first time in 50 years. Further supporting the notion that e-cigarettes were replacing cigarettes, states that had banned the sale of e-cigarettes to minors witnessed an increase in cigarette smoking in that age group. And there is no denying that e-cigarettes are safer; unlike cigarettes, they don’t produce tars that cause cancer or combustion products like carbon monoxide that cause heart disease. “People smoke for the nicotine but they die from the tar,” said Michael Russell, a pioneer of nicotine-cessation treatments.
Maybe this is all coincidence. Maybe there are other reasons that cigarette smoking is declining that have nothing to do with the rise of e-cigarettes. But it’s too early to condemn e-cigarettes as a gateway product to cigarette smoking when the opposite appears to be true. Time will tell. The point is that the cultural milieu that damns e-cigarettes is irrelevant; only the data are relevant. In August 2015, England’s Department of Health recommended e-cigarettes as an effective way to stop smoking. Nine months later, in April 2016, the Royal College of Physicians, an organization of British doctors founded in 1518, supported the Department of Health’s decision. British physicians had been influenced by a study in the United Kingdom showing that smokers who used e-cigarettes were much more likely to quit than those who used nicotine patches.
Like e-cigarettes, GMOs have also fallen victim to the zeitgeist.
GMOs are defined as any living organism that possesses “a novel combination of genetic material obtained through the use of modern biotechnology.” The key phrase here is “modern biotechnology,” because the truth is that we have been genetically modifying our environment since long before recorded history. Using breeding or artificial selection, humans began to domesticate plants and animals around 12,000 B.C.—all for the purpose of selecting for certain genetic traits and all a precursor to modern genetic modification. Nonetheless, for environmentalists, no single act of hubris has been more terrifying than when scientists decided to recombine DNA in the laboratory to modify nature.
Today, the largest use of genetic bioengineering has been in food production. Genetic engineering has allowed crops to resist pests, tolerate extreme temperatures and environmental conditions, and be free of certain diseases. Genetically engineered crops have also been created to improve nutrient profiles, lengthen shelf lives, and resist herbicides. In the United States, 94 percent of soybeans, 96 percent of cotton, and 93 percent of corn are genetically modified; in the developing world, 54 percent of crops are genetically modified. The consequences, especially for farmers in the developing world, have been dramatic. GMO technology has reduced chemical pesticide use by 37 percent, increased crop yields by 22 percent, and increased profits for farmers by 68 percent. Although GMO seeds are more expensive, the cost is easily offset by reduced use of pesticides and higher yields.
Although many people fear that genetically modified foods might be more dangerous than other foods, careful scientific studies show they have no reason for concern. The American Association for the Advancement of Science and the National Academy of Sciences have both issued statements supporting the use of GMOs. Even the European Union, which has never been particularly supportive of GMOs, cannot ignore the science. In 2010, the European Commission issued the following statement: “The main conclusion to be drawn from the efforts of more than 130 research projects, covering a period of more than 25 years of research involving more than 500 independent research groups, is that biotechnology, and in particular GMOs, are not per se more risky than conventional plant breeding technologies.”
Although the science is clear, the public remains concerned. A recent Gallup poll found that 48 percent of the American public believed that genetically modified foods posed a serious risk to consumers. Many of those polled wanted foods to contain GMO warning labels so they could know which ones to avoid. This poll showed that not only are we willing to ignore science, but we’re also willing to ignore history. Due to selective breeding and cultivation, the crops we raise today “naturally” bear little resemblance to their ancestors. From a practical standpoint, a farmer taking advantage of a chance mutation to cultivate a specific crop is indistinguishable from our choosing to create that mutation ourselves. Either way, the crop carries the same mutation.
Genetic modification has also been used to make lifesaving medicines. Insulin used by diabetics, clotting proteins used by hemophiliacs, and human growth hormone used by children with short stature have all been made using genetic engineering technology. Previously, these products were obtained from pig pancreases, blood donors, and the pituitaries of dead people.
Yet those who oppose GMOs persist. Recently, the story of a tomato containing a fish gene made the rounds. The Frankensteinian image galvanized environmentalists to push harder to label GMO foods. Steven Novella, an assistant professor at Yale University School of Medicine and the creator of the podcast The Skeptics’ Guide to the Universe, summed it up best: “The real question here is not whether there is a GMO tomato with a fish gene, but who cares?” he wrote. “It’s not as if eating fish genes is inherently risky—people eat actual fish. Furthermore, by some estimates, people share about 70 percent of their genes with fish. You have fish genes and every plant you have ever eaten has fish genes. Get over it!”
The GMO controversy reached its illogical end in 2015, when New York Assemblyman Thomas J. Abinanti introduced Bill 1706, banning all genetically modified vaccines. Not surprisingly, most vaccines are genetically modified. If not, then people would be injected with the “natural” bacteria or viruses that caused the disease. For example, by genetically modifying poliovirus, we’ve eliminated polio from the United States and from much of the world. Vaccines have to be genetically modified.
Perhaps no single chemical has suffered from the zeitgeist more than bisphenol A (BPA).
In 1935, the DuPont chemical company launched its slogan, “Better Living Through Chemistry.” In 1982, DuPont dropped “Through Chemistry” and later abandoned the slogan altogether in favor of “The Miracles of Science.” The word “chemistry” just didn’t sit well with the American public. We seem to respond negatively to anything with a chemical name. And bisphenol A certainly fits that bill.
BPA, which was first synthesized in 1891, wasn’t commercially available in the United States until 1957, when it was used to make plastics and resins. The chemical found its way into goggles, face shields, bicycle helmets, water bottles, baby bottles, CDs, DVDs, the lining of water pipes, and the lining of metal soup and soda cans. Ironically, BPA wasn’t invented to make plastic clear and tough. It was invented as a synthetic estrogen, the hormone primarily responsible for regulating the female reproductive system. But BPA was a weak estrogen—about 40,000 times weaker than other synthetic estrogens—so it was abandoned, only later to be picked up as a plasticizer. What researchers soon discovered, however, was that this weak hormone, although insoluble in water, could leach out of plastic or metal containers. They feared that Americans, including American babies, might unknowingly be ingesting a feminizing hormone.
Concerned that BPA might be harmful, researchers studied its effects on mice and rats, linking it to breast cancer, prostate cancer, early onset puberty, ovarian cysts, obesity, and even attention deficit disorder. Then they started investigating people, finding that 93 percent of adults had traces of BPA in their urine. “If you don’t have BPA in your body,” wrote one Time magazine reporter, “you’re not living in the modern world.”
Armed with this information, Nalgene, which makes plastic containers, removed BPA from all of its products. Then the FDA banned its use in baby bottles. BPA was now, according to one reporter, “among the world’s most vilified chemicals.” Like the story of DDT, however, the BPA story soon fell apart.
Initially, researchers had trouble reproducing the animal model studies, especially when using quantities of BPA likely to be encountered by people. A 2004 report from the Harvard Center for Risk Analysis found “no consistent affirmative evidence for low-dose BPA effects.” Glenn Sipes, who was a co-author of the study, said, “I’ve never had a problem saying that we can see biological effects in these low doses. But why are we seeing these studies that can’t be repeated? Why do we have to work so hard to try and replicate and show these low doses really have an effect? Why don’t [problems with BPA] stand out in black and white?”
In 2011, a review of studies in people found no evidence that low doses of BPA caused harm. The reason that studies in rodents had found that BPA had caused problems was that the rodents had been injected with BPA; injection bypassed the liver, which typically inactivates BPA within five minutes. When rodents were fed BPA instead of being injected with it, those given 40, 400, or 4,000 times the typical human exposure remained healthy.
In July 2014, the FDA stated that “BPA is safe at the current levels occurring in foods.” Similarly, the European Food Safety Authority, which published recommendations about BPA in 2008, 2009, 2010, 2011, and 2015, also continues to state that BPA is safe. Both agencies have set limits for the tolerable daily intake (TDI) for BPA. To exceed this limit, an average adult would have to ingest about 10,000 times more BPA than one would typically ingest in a day: the equivalent of eating more than 500 cans of soup. Nonetheless, today it’s hard to find a water bottle that doesn’t proudly proclaim “BPA-free” on the label.
Another reason that we should have been suspicious of the BPA studies is that mice aren’t men. All studies of experimental animals should be viewed with caution. For example, in the early 1970s, saccharin was shown to cause bladder cancer in rodents. As a result, all food containing saccharin bore a label warning of its dangers. By 2000, scientists realized that what was happening in rodents wasn’t happening in people. The reason was that, unlike human urine, rodent urine is highly acidic and contains large quantities of calcium phosphate and proteins. For these reasons, rodents fed saccharin formed microcrystals in their urine that damaged the lining of the bladder, causing bladder cancer. None of these events occurred in people. On December 21, 2000, the FDA removed warning labels from foods containing saccharin.
Also, if you’re going to say that animal studies predict events in people, then we should stop eating chocolate, which can cause heart arrhythmias and occasionally death in dogs. As it turns out, dogs cannot tolerate even small amounts of a substance in chocolate called theobromine. People, on the other hand, can consume much larger quantities of chocolate without getting sick. (I am living proof of this.)
Animal studies can also be misleading for another reason: They can show that something is valuable even when it isn’t. For example, early studies of a vaccine to prevent HIV were promising in experimental mice and monkeys. But studies in people have been far less promising. “Mice lie and monkeys exaggerate,” says University of Pennsylvania vaccine researcher David Weiner.
Our fear of anything with a chemical name isn’t likely to go away anytime soon. A few years ago the comedians Penn and Teller performed an experiment. They sent a friend to a fair in California to collect signatures on a petition to ban dihydrogen monoxide. Hundreds of people signed the petition, convinced that the chemical was bad for you. Dihydrogen means two hydrogen (H) atoms and monoxide means one oxygen (O) atom. The combination, H2O, is water. Simply by using a chemical name, their friend was able to convince hundreds of people to ban water from the face of the Earth.
4. Beware the quick fix.
Like lobotomies, mental institutions bursting at the seams with adult schizophrenia patients are a relic of the past. Schizophrenia has become an outpatient disease. So has autism, arguably the most common psychiatric disorder of children. As a consequence, the pressures to find a cure have shifted from psychiatrists working at public facilities to parents living in private homes. Unfortunately, like their counterparts in the past, parents have become desperate, willing to do anything to relieve the suffering. So, although we may think that gruesome, ill-conceived, medieval therapies like lobotomies are behind us, they’re not.
Children with autism have been put in hyperbaric oxygen chambers, causing intense, painful pressure on their eardrums and, in one case, death. They’ve been given intravenous medicines designed to bind heavy metals, causing another child to die when his heart stopped beating. They’ve been taken to Mexico or other countries for stem cell transplantations. And, perhaps worst of all, they’ve been subjected to a therapy invented by a former Scientologist turned health evangelist named Jim Humble, who calls himself the archbishop of the Genesis II Church of Health. In his online video, Humble claims to be a billion-year-old god from the Andromeda galaxy.
Humble believes that autism—as well as AIDS, malaria, cancer, and Alzheimer’s disease—is caused by worms living in the intestine.
To kill the worms, he invented what he calls the Miracle Mineral Solution or MMS. MMS contains sodium chlorite and citric acid, which combine to form chlorine dioxide, a powerful bleach. MMS, which children swallow or receive as an enema, is now quite popular in the autism community. The problem, apart from the fact that autism isn’t caused by worms, is that even small quantities of MMS can cause nausea, vomiting, diarrhea, intestinal bleeding, respiratory failure, hemolysis (when red blood cells in the bloodstream break apart), and, ironically, developmental delay. In October 2015, one U.S. vendor was sent to prison for selling the product. MMS has been linked to at least one death.
Parents who subject their children to MMS often share their stories online. They write about children crying out in pain. They show pictures of the lining of their children’s intestines that have come out in their stools, believing, wrongly, that they’re worms. They talk about how their children’s hair has fallen out. And they talk about how their children have slowly grown more apathetic, losing any previous emotion. How, as they have chronically poisoned their children with an industrial bleach, their children have quieted down, becoming much easier to handle. In essence, how—as had been the case for lobotomies—they have merely substituted one disorder for another. Still, these parents urge each other on. It’s working, they claim.
The contrasts between lobotomy and MMS therapy are striking. Lobotomies were endorsed by the American Medical Association, the American Psychiatric Association, and the New England Journal of Medicine. MMS therapy has never been endorsed by any professional or medical organization; on the contrary, the FDA has issued a warning against its use. Ice pick lobotomies were invented by a respected neurologist who was a professor at a well-known medical school. MMS therapy was invented by a man claiming to be from a galaxy 2.5 million light-years from Earth. Frankly, it’s much easier to understand how people could lobotomize their children than squirt a powerful industrial bleach into their children’s mouths and rectums.