
An Elegant Defense


by Matt Richtel


  This philosophy is increasingly widely held. It is called the hygiene hypothesis, and the broad idea is that we are starving our immune systems of training and activity by an excessive, even obsessive, focus on cleanliness.

  “I tell people, when they drop food on the floor, please pick it up and eat it. Get rid of the antibacterial soap. Immunize! If a new vaccine comes out, run and get it. I immunized the living hell out of my children. And it’s okay if they eat dirt. We have animals in our homes, and they sleep with us. If your dog shits on the floor, clean it up, of course, but don’t use bleach. You should not only pick your nose, you should eat it.”

  Seriously?

  Yeah, Dr. Lemon says, why not?

  “Our immune system needs a job. We evolved over millions of years to have our immune systems under constant assault. Now they don’t have anything to do.”

  Our elegant defense has grown restless.

  “But it’s a hard discussion to have with patients. They’ve been brainwashed that they have a weak immune system. People look at me like I’m crazy.”

  The number of people suffering from common autoimmune disorders and allergies has increased sharply.

  Evidence is mounting about how the balance of our immune system has changed—how the modern world has upset it.

  Is Dr. Lemon crazy? Should you pick your nose?

  Here I will briefly turn the focus to four major factors of day-to-day life that impact autoimmunity and immunity broadly, in the lives of Linda and Merredith, Bob, Jason, and you and me. These four factors are sleep, stress, the gut, and hygiene.

  All these roads will eventually lead us back to Jason, and the epic battle waged in his life’s festival.

  32

  Should You Pick Your Nose?

  Don’t laugh. These are now serious questions. Should you pick your nose? Should your children pick their noses?

  “I don’t know. It might have some negative social consequences,” one epidemiologist told me. She was quite serious: The biggest downside to nose-picking (and eating) might be the social consequences. Could it actually be a health advantage?

  Should your children eat dirt? Maybe.

  Should you use antibacterial soap or hand sanitizers? No.

  Are we taking too many antibiotics? Yes.

  For more complete answers, let us turn to nineteenth-century London.

  The British Journal of Homeopathy, volume 29, published in 1872, includes a startlingly prescient observation about hay fever: “Hay fever is said to be an aristocratic disease, and there can be no doubt that, if it is not almost wholly confined to the upper classes of society, it is rarely, if ever, met with but among the educated.”

  Hay fever is a catchall term for seasonal allergies to things like pollen and other airborne irritants. This nineteenth-century essay, incidentally, says it can be difficult to distinguish between hay fever and asthma or rheumatism. This is worth noting because these turn out to be autoimmune disorders, and allergies wind up as a close cousin. The immune system is overreacting.

  With this idea that hay fever was an aristocratic disease, the British scientists were on to something.

  More than a century later, in November 1989, another highly influential paper was published on the subject of hay fever. The paper was short, less than two pages, published in BMJ and titled “Hay Fever, Hygiene, and Household Size.” The author looked at the prevalence of hay fever among 17,414 children born in March of 1958. Of sixteen variables the scientist explored, he described a “most striking” association between the likelihood that a child would get hay fever and the number of his or her siblings. It was an inverse relationship, meaning the more siblings the child had, the less likely it was that he or she would get the allergy. Not just that, but the children least likely to get allergies—also known as atopic diseases—were ones who had older siblings.

  The paper hypothesized that “allergic diseases were prevented by infection in early childhood transmitted by unhygienic contact with older siblings or acquired prenatally from a mother infected by contact with her older children.

  “Over the past century declining family size, improvements in household amenities, and higher standards of personal cleanliness have reduced the opportunity for cross infection in young families,” the paper reads. “This may have resulted in more widespread clinical expression of atopic disease, emerging in wealthier people, as seems to have occurred in hay fever.”

  This is the birth of the hygiene hypothesis. It provides one of the most telling and vivid insights into the challenges that human beings face in our relationship with the modern world. In a nutshell, that challenge revolves around the idea that we evolved over millions of years to survive in the environment around us. For most of human existence, that environment was characterized by extreme challenges, like scarcity of food or food that could carry disease, unsanitary conditions and unclean water, withering weather, and on and on. It was a very dangerous environment, a heck of a thing to survive.

  At the center of our defenses was our immune system. These defenses are the product of millennia of evolution, the way a river stone is shaped by the water rushing over it and the tumbles it experiences on its wayward journey downstream.

  Along the way, we humans learned to take steps to bolster our defenses. Prior to the discovery of medicines, we developed all manner of custom and habit to support our survival. In this way, think of the brain—the organ that helps us develop habits and customs—as another facet of the immune system. For instance, we used our collective brains to figure out effective behaviors. We started washing our hands or took care to avoid certain foods that could be dangerous or deadly. Some cultures avoid pork, which is highly susceptible to trichinosis; others banned certain meats that could carry toxic loads of E. coli. Ritual washing gets mentioned in Exodus, one of the earliest books in the Bible: “So they shall wash their hands and their feet, so they will not die.”

  Our ideas evolved, but for the most part, our immune system did not. This is not to say that our immune system didn’t undergo change. The immune system responds to our environment and learns. This is central to the branch of the immune system known as the adaptive immune system. Our immune system comes into contact with various threats, develops an immune response, and then is much more able to deal with that threat in the future. In that way, we adapt to our environment.

  But adaptation is not the same thing as evolution. Adaptation involves responding to the environment within the limits of individual physical capacities. To take a random example: If you learn that you are more likely to catch a bird if you hunt at dawn, you will wake up early and go hunting. You are adapting to your environment. By contrast, evolution involves fundamentally changing our physical capacities over the course of many generations. Evolution, in this case, might optimize our bird-catching ability by the development of wings. For humans to become winged creatures, it would take eons.

  What does this have to do with your immune system and allergies? Lots.

  To survive, we adapted within our physical capacities. We washed our hands, swept our floors, cooked our food, or avoided certain foods altogether. We learned and adapted.

  Then our learning and adaptation began to intensify as we built quickly upon past discoveries. Human discoveries came in leaps and bounds. We developed medicines like vaccines and antibiotics. Virtually overnight, we changed the environment with which our immune system interacted. We improved the hygiene of the animals we raised and slaughtered for food, and that of our crops and kitchens. Particularly in the wealthier areas of the world, we purified our water, developed plumbing and treatment plants for water and human waste; we isolated and killed bacteria and other germs. But for the most part, our immune system continues to be the same one humans have always had. It developed and evolved to allow us to survive in a particular type of environment—one teeming with pathogens. On one level, we have given our immune system a major helping hand. Its enemies list was attenuated. On another level, though, our immune system is proving that it cannot keep up with this change.

  A 1921 Lysol Disinfectant ad. Germ killing has been great for business, mixed for public health.

  At a core level, we have created a mismatch between our immune system—one of the longest surviving and most refined balancing acts in the world—and our environment. Thanks to all the powerful learning we’ve done as a species, our immune system isn’t getting the regular interaction with germs that helped to teach and hone it—that “trained” it. It doesn’t encounter as many bugs when we are babies. This is not just because our homes are cleaner, but also because our families are smaller (fewer older kids to bring home the germs), our foods and water cleaner, our milk sterilized, and on and on.

  What does an immune system do when it’s not properly trained?

  It overreacts. It becomes aggrieved by things like dust mites or pollen. It develops what we call allergies, chronic immune system attacks—inflammation—in a way that is counterproductive, irritating, even dangerous. There has been a rise in autoimmunity too.

  The numbers are significant.

  The percentage of children in the United States with a food allergy rose 50 percent between 1997–1999 and 2009–2011, according to the Centers for Disease Control and Prevention.

  Of similar magnitude, skin allergies jumped 69 percent during that period, leaving 12.5 percent of American children with eczema and other irritations.

  In keeping with the themes mentioned earlier in this chapter, food and respiratory allergies rose with income level. More money, which typically correlates with higher education, meant more risk of allergy. This could reflect differences in who reports such allergies but also differences in environment.

  These trends are seen internationally too. Skin allergies “doubled or tripled in industrialized countries during the past three decades, affecting 15–30 percent of children and 2–10 percent of adults,” according to a paper published by the British Society for Immunology. Asthma, the paper reads, “is becoming an ‘epidemic’ phenomenon.”

  By 2011, one in four children in Europe had an allergy, and the figure was on the rise, according to a report by the World Allergy Organization. Reinforcing the hygiene hypothesis, the paper noted that migration studies have shown that some types of both allergy and autoimmunity rise as people move from poorer to richer countries. The prevalence of diabetes is higher in Pakistanis who move to the United Kingdom than in those who remain in Pakistan. The incidence of lupus is higher among African Americans than West Africans, the paper notes.

  There are similar trends with inflammatory bowel disease, lupus, rheumatic conditions, and in particular, celiac disease. That entails the immune system’s overreacting to the protein in gluten. This attack, in turn, damages the walls of the small intestine. This might sound like a food allergy, but it is different in part because of the symptoms. In the case of an autoimmune disorder like this one, the inflammation happens in the area of insult; the immune system attacks the protein and associated regions.

  Allergies can generate a more generalized response. A peanut allergy, for instance, can lead to swelling of the airway, part of a severe reaction known as anaphylaxis, which can cause suffocation.

  In the case of both allergy and autoimmunity, though, the immune system reacts more strongly than it otherwise might, or than is “healthy” for the host (yeah, I’m talking about you).

  This is not to say that all of these increases are due to better hygiene, a drop in childhood infection, and its association with wealth and education. There have been many changes to our environment, including new pollutants. There are absolutely genetic factors as well. But the hygiene hypothesis—and when it comes to allergy, the inverse relationship between industrialized processes and health—prevails.

  An instructive study has to do with the Amish.

  The Amish are not known for a tendency to lend excitement to most proceedings, but this is the kind of study that gets researchers all fired up. The study looked at the prevalence of allergy among two communities, one Amish in Indiana and the other Hutterite in South Dakota. Why is this particular study so exciting to scientists? It’s because these two groups have remained relatively isolated since they moved to the United States several hundred years ago (the Amish in the 1700s from Switzerland, and the Hutterites in the 1800s from South Tyrol, which borders Switzerland in northern Italy). The upshot: They descend from relatively similar genetic stock and have like-minded approaches to things known to impact allergy, including large family size, high rates of vaccination, and, notes the study, “taboos against indoor pets.” Ah, but livestock. No taboos there.

  This is both a similarity and the key difference.

  The Amish in the study “practice traditional farming, live on single-family dairy farms, and use horses for fieldwork and transportation. The Hutterites live on large, highly industrialized, communal farms.”

  There is another major difference, this having to do with the prevalence of allergy. Only 5 percent of Amish schoolchildren suffered asthma, compared to 21 percent for the Hutterites.

  On a slightly lesser measure of sensitivity—called allergic sensitization—7 percent of Amish kids qualified, compared to 33 percent for the Hutterites.

  The researchers asked what was causing two groups of people with very similar genetic backgrounds, similarly isolated from other groups culturally and environmentally, to have such different allergy profiles.

  One powerful clue the researchers discovered was that the households of the Amish were much more likely to have allergens, “from cats, dogs, house-dust mites, and cockroaches.” Forty percent of Amish homes had them, compared to 10 percent for the Hutterites. You’d think you’d rather live in the Hutterite household, right?

  We’re just getting started.

  Residue from bacteria, the kind that cause disease, was nearly seven times higher in the Amish homes.

  Now the twist. Researchers looked inside the bodies of the Amish subjects and found evidence that turns knee-jerk disgust on its head. The Amish children had a higher proportion of immune system cells called neutrophils. Remember these? They are front-line fighters.

  Among the Amish, there was also a relatively lower proportion of eosinophils. These are another kind of white blood cell; they are solid, all-purpose fighters essential to destroying viruses, bacteria, and parasites. They can cause inflammation, and that, as you know now, is a double-edged sword. In fact, when their numbers are elevated, they are highly associated with allergy and autoimmunity; in excess, they can be markers of asthma and eczema, lupus, Crohn’s disease, and other conditions.

  The Amish and Hutterites were subjected to a type of bacteria known to elicit a strong immune system response measured by the levels of cytokines, like interferon, interleukins, and such. Overall, the bacteria elicited the same twenty-three cytokines, but in lower proportions in the Amish.

  “As compared with the Hutterites, the Amish, who practice traditional farming and are exposed to an environment rich in microbes, showed exceedingly low rates of asthma and distinct immune profiles that suggest profound effects on innate immunity,” reads the paper in The New England Journal of Medicine.

  The researchers then did studies in mice to try to reproduce the results. These showed that mice raised in relatively microbe-rich environments, like those of the Amish, developed immune systems that were more effective in key ways than mice raised in Hutterite-like environments.

  I’m going to quote the study in all its scientific-vernacular glory, partly because readers who have come this far have earned the ability to grasp most of it.

  The concordance between findings from studies in humans and in mice was remarkable: in both studies protection was accompanied by lower levels of eosinophils, higher levels of neutrophils, generally suppressed cytokine responses, and no increase in levels of T regulatory cells or interleukin-10. Thus, the finding that these features were largely dependent on innate immune pathways in mice suggests that innate immune signaling may also be the primary target of protection in the Amish children, in whom downstream adaptive immune responses may also be modulated.

  Now for the plain-English version. The dust and pet filth, the cockroach miasma, and the barnyard residue, far from being an enemy, impacted the immune system through both pathways, innate and adaptive, and the Amish kids were far less likely to get an allergy.

  So maybe you should pick your nose and eat what you extract? The study doesn’t speak to that. But it might explain the urge we sometimes get. Maybe we’re shoving a few germs up the nostril to test the system, the same way that kiddos put lots of things in their mouths. During the research for this book, a well-known immunologist told me that children should “eat a pound of dirt a day.” He was being somewhat glib, but you can now get his point.

  A lot of products have been marketed to suggest otherwise.

  When I was a kid, I collected Wacky Packages. They were packs of trading cards and stickers that made fun of major brand names. Milk-Bone for dogs was rendered as Milk Foam, and Band-Aid was Band-Ache. Each pack came with a rectangular stick of pink gum that was almost certainly manufactured in the 1700s.

  Among the products that made the cut for Wacky Packages were multiple hygiene and cleaning products: Windhex (Windex); Ajerx (Ajax); Toad (Tide detergent).

  No wonder. These kinds of products were heavily advertised during a surge in hygiene-related marketing that began in the late 1800s, according to another novel study, published in 2001 by the Association for Professionals in Infection Control and Epidemiology. You heard right: the researchers, from Columbia University, were trying to understand how we became so enamored with soap products. Some highlights:

  The Sears catalogue in the early 1900s heavily advertised “ammonia, Borax, and laundry and toilet soap.”

 
