By 2030, the global cancer burden is expected to nearly double, with dramatic increases in countries that historically had low rates but have now adopted fast foods and American eating habits.59
For many years, evidence from hundreds of studies has shown that the SAD promotes cancer because it is high in both highly refined foods and animal products and low in colorful produce. Study after study shows that eating lots of fruits and vegetables protects against cancer. It should be clear to everybody that if we want to avoid getting cancer, we should change our diets.
But wait—when looking at modern studies of dietary patterns in adult American women, one sees that the results have not been so clear. The women in these studies made significant dietary improvements by cutting fat and increasing produce, yet their rates of developing cancer were only slightly reduced. What’s going on here?
The reason scientists have failed to show a radical reduction in cancer from adding more healthy plant foods to the diet is that some of the protective effect is blunted when these dietary modifications are made too late in life. Cells are most sensitive to damage when they are young, so for dietary modification to offer the most powerful anticancer effects, mimicking the very low cancer rates we see in areas of the world where people eat more produce, these changes must be started earlier in life.
Today’s science is fascinating. It shows that the major effect of diet as a cancer promoter occurs much earlier in life than previously thought. The first seven years are critical in creating or preventing cancer—and we are talking about the most common cancers, which generally occur after age 50 or 60.
Several cancers, especially colon cancer, are associated with obesity, but this association by itself is not strong. It becomes powerful when we look at the age at which a person becomes overweight. Excess body mass in early childhood is most ominous.60
Recent studies have confirmed that most adult cancers are strongly associated with overeating and increased calorie consumption during childhood, especially consumption of empty calories. Although childhood growth and early maturity have been hailed as successes of the twentieth century, the scientific data call these common parental objectives into question. Childhood diets heavy in milk, cheese, and meat, as well as bread, oil, and sweeteners, may be effective in producing big adults, but they are also extremely effective in producing sick adults who are prone to cancer.
EARLY MATURATION AND BREAST CANCER
Experimental evidence suggests that the susceptibility of mammary tissue to cancer promoters is greatest in the early teenage years and early adulthood. The period of breast growth and development is a particularly sensitive time in a young woman’s life, one that influences her risk of breast cancer later in adulthood.
Of particular concern is the pattern linking breast cancer to the earlier age of puberty we are witnessing in modern times. In the nineteenth century, the average age of puberty, as marked by the onset of menstruation, was 17 years, whereas over the past fifty years in Western industrialized countries, such as the United States, it has been 12 years.61 An earlier age of puberty is considered a risk factor for developing breast cancer.62
Endocrinologists are seeing more and more girls with precocious sexual development, even before today’s average age of 12, and medical studies confirm that the trend is real and worsening. Estrogen unquestionably stimulates the development and growth of breast cancer cells; however, the timing of this exposure is what is most crucial, and it is highly complicated. One thing we know for sure is that girls who experience puberty earlier have much higher estrogen levels and maintain these significantly higher levels for many years.63
The heightened levels of estrogen and IGF-1 initiated by poor food choices early in life remain elevated throughout the critical years when breast tissue is developing and most sensitive to damage. Most dangerous is the dietary pattern that combines sweets and meats—in other words, a diet that results in high levels of insulin, IGF-1, and estrogen. The levels of these three hormones, which are affected by diet and body fat, correlate well with the geographic distribution of breast cancer worldwide.
It is particularly important to note the age range during which diet most critically affects the age of puberty. A study published in 2000 followed children from birth and reported that girls who consumed more animal products and fewer vegetables between ages 1 and 8 were prone to early maturation and puberty, but the best predictor was an animal protein–rich diet before age 5.64
Fat cells produce estrogen, so excess body fat during childhood results in more estrogen exposure. Higher intakes of calories, milk, and total animal protein have been linked to earlier puberty.65 In contrast, a high intake of fiber-rich fruits, beans, and vegetables lowers circulating estrogen levels. Diet can powerfully modulate estrogen levels in childhood: a 2003 study showed that 8- to 10-year-old girls who were closely followed through a seven-year dietary intervention dramatically lowered their estrogen levels compared with a control group that made no dietary changes.66 As we get older, the opportunity to lower this risk diminishes.
This pathological early maturation of today’s children is threatening. Cancer has been shown to arise many years after precancerous changes occur in the breast, and ominously, these changes are increasingly visible in teenagers before their 18th birthdays.67 The evidence is clear that breast cancer is a disease caused disproportionately by childhood and teenage eating practices.
“Let’s stop at ‘Cancer Queen’ so the kids can have one of their ice cream sundaes.”
A 2013 study followed almost one hundred thousand women between the ages of 26 and 46 and found that the younger the women, the greater the effect diet had on later breast cancer incidence. The researchers noted that consumption of dairy foods and meat was most strongly associated with the occurrence of breast cancer.68 Prostate cancer shows the same causative relationship with early-life events.69
This does not mean that once we have passed the age of 30 our goose is cooked. But it does mean that moderate improvements in dietary practices adopted later in life are insufficient to drive cancer rates to very low levels. Nothing less than nutritional excellence is needed to repair the DNA damage and methylation defects that begin early in life and accumulate over years of eating the SAD, resulting in higher risks of developing cancer.
The Nutritarian diet, the gold standard of dietary excellence, is specifically designed to dramatically lower cancer risk and extend life span, even in people who have not eaten very healthfully earlier in life. When consumed consistently for many years in adequate quantity and variety, phytochemicals work together to detoxify cancer-causing elements, deactivate free radicals, and enable DNA repair mechanisms.70 Nevertheless, full protection against cancer and the full potential for human longevity can be realized only from a lifetime of healthy eating.
Dietary Influence on Breast and Prostate Cancer Risk According to Age
Today, new cancer diagnostic techniques are emerging, including blood tests that can detect breast cancer 10 years or more before mammograms can. Breast cells become abnormal many years—even decades—before mammograms can detect a collection of cells large enough to be seen by the human eye. Even at this very early stage, when there are comparatively few abnormal cells, protein markers are shed into the blood. What these new tests are revealing is that about 40 percent of adults in the United States over the age of 40 already have cancer cells in their bodies.71
This is an important reason why the Nutritarian diet style has become so popular among Americans looking to repair such damage and prevent cancer. Now is the time to eat very healthfully—not after a cancer diagnosis. Furthermore, the younger you begin eating well, and the earlier in the process of cancer development that you initiate an anticancer nutritional protocol, the greater the probability that a disease process or cancer can be reversed.
My philosophy of health and health care involves the underlying principle that genetics play a smaller role in the etiology of most chronic diseases than do environment and nutrition. A properly nourished body is highly resistant to infection, has natural defenses against cancer, and is naturally slim and muscular. It cannot be denied that heart disease, strokes, obesity, and even most common cancers can be prevented. However, it would most likely take embracing a healthful diet throughout life, not just after age 50, to reduce cancer to a very rare occurrence.
I claim that nutritional excellence can:
• Prevent high blood pressure and even reverse it in the vast majority of cases
• Prevent type 2 diabetes and even reverse it in the vast majority of cases
• Prevent heart attacks and even reverse advanced heart disease in the vast majority of cases
• Prevent breast cancer, prostate cancer, and colon cancer, if adopted early enough in life, and even reverse these cancers in many early cases
• Prevent childhood cancer and autism if dietary excellence is adopted early enough, beginning before conception
• Improve the overall health, intelligence, emotional stability, and happiness of our population
The fact that these claims seem so radical speaks to how uninformed and misled our nation’s people are—because these claims are not radical, nor are they outrageous. If they seem extreme, that only reflects the extreme (unhealthy) nature of the SAD and the extent of people’s ignorance about the critical importance of proper nutrition. When a person develops brain cancer or multiple sclerosis in his or her 30s, or a parent dies in his or her 40s of a heart attack, leaving children with only one parent or no parent at all, eating healthfully to prevent such tragedies doesn’t seem so extreme. It only seems extreme because the medical community thrives in a drug culture in which we are taught that drugs are our tickets to health and that the SAD is not the cause of our diseases.
The incidence of obesity and diabetes is still climbing, and it is worst among those of lower economic means and education. The number of kids diagnosed with autism is exploding. More people are developing diabetes at younger ages, and we are seeing more heart attacks and strokes in young people. This represents unacceptable human tragedy, and it foreshadows higher rates of cancer and dementia, as well as an even greater strain on our costly, ineffective, and inefficient health care system, including our overburdened nursing homes and hospitals. This is the genocide I am talking about. What are we waiting for? For things to get even worse?
CHILDHOOD OBESITY
Sources: Ogden CL, Carroll MD, Lawman HG, et al. Trends in obesity prevalence among children and adolescents in the United States, 1988–1994 through 2013–2014. JAMA. 2016;315:2292–99; Childhood Obesity in the United States, 1976–2008: Trends and Current Racial/Ethnic, Socioeconomic, and Geographic Disparities. Health Resources and Services Administration, Maternal and Child Health Bureau. Rockville, MD: U.S. Department of Health and Human Services, 2010.
TRUE HEALTH CARE IS SELF-CARE
Social norms, economic incentives, and the development of our pharmaceutical-centered approach to chronic disease have encouraged the masses to transfer responsibility for their health to doctors, and to equate health care with medical care. When we absolve ourselves of personal responsibility for our health, and accept the myth that diet doesn’t matter and genes dictate our health future, we lay the groundwork for health tragedies.
The historical rise of a pharmaceutical-based system of health care, instead of one based on lifestyle medicine, is a foundational deficit in our health care system, one that politicians and governmental regulations can never fix. Our present drug-centered health care system has contributed to large segments of our population committing slow suicide with fast food.
Things might begin to change if hospitals became protective enclaves where not only smoking but also junk food and all foods containing white flour, salt, and oil were prohibited. This would be a huge change from the current climate, in which hospitals install fast food restaurants on their premises, place fast food kiosks in their lobbies, and serve the likes of pancakes with syrup and white bread to their patients. If all physicians, health care workers, and health authorities in the United States were lean, ate very healthfully, and advocated healthy living and eating, a trickle-down effect would reinforce the message about the dangers of commercially baked goods, fast foods, fried foods, processed meats, and commercially raised animal products.
There is movement in this direction. The American College of Lifestyle Medicine is a physician specialty organization, with a rapidly growing membership, that puts lifestyle and nutritional medicine first. Its members generally talk the talk and walk the walk. The field is defined this way: “Lifestyle Medicine involves the therapeutic use of lifestyle, such as a predominately whole food, plant-based diet, exercise, stress management, tobacco and alcohol cessation, and other non-drug modalities, to prevent, treat, and, more importantly, reverse the lifestyle-related, chronic disease that’s all too prevalent.”72
“We have a special today on the big whopping burger with bacon-macaroni and cheese. It comes with a free ride to the hospital!”
This sensible approach, with doctor as teacher and motivator for healthier habits rather than merely prescriber of medication and doer of procedures, is not “alternative medicine” or “holistic medicine”; rather, it is progressive medicine. It is where medicine should have gone—and would have gone—if the financial incentives and political and economic power of the pharmaceutical industry were not so massive and influential. All physicians should receive extensive training in lifestyle medicine and nutritional science, and nutritional healing should be heavily embraced in medical schools and residency programs. This is a critical skill set that is missing in the training of health care professionals. The medical profession should have incorporated it years ago, and should embrace it today.
Doctors should not smoke. They should not drink alcohol; they should not drink soda; they should not eat fast food or junk food. They should be pillars of health in our communities, and they should fight for healthy habits among their patients and in their communities.
You do not have to follow my nutritional advice—that is up to you. My mission is not dependent on what you do, but I am committed to making sure that you, and others, know that you don’t have to be sick: You can live a better, happier life without the fear of heart attacks, strokes, dementia, and even cancer. I hope that becomes your future.
CHAPTER FOUR
THE LESSONS OF HISTORY
Not ignorance, but ignorance of ignorance, is the death of knowledge.
—ALFRED NORTH WHITEHEAD
Our society is a reflection of the foods we eat. This isn’t a new concept. In fact, history has shown us this truth with frightening accuracy. That’s why the pages that follow are as much a history lesson as they are a discussion of nutrition and health. I’ve included the examples you’re about to read because they reveal the ways in which our society has suffered from nutritional deficiency in the past, and why it is so crucial that we change this for the future. I hope that the information in this chapter will lead you to greater understanding of and compassion for those people who are most affected by poor nutritional options, and will serve as a rallying cry for positive change. We cannot let the discoveries of nutritional science and what they reveal about the causes of death and disease remain a conversation “for another time.” The time is now. As you’ll see, the mistakes of history have carried over into our current reality.
THE TRAGEDY OF PELLAGRA
More than one hundred years ago, extreme violence plagued the South, especially violence against former slaves in the years following the Civil War. For example, the Freedmen’s Bureau register lists more than a thousand murders in Texas from 1865 to 1866, mostly of former slaves.1 After the war, racist groups emerged, such as the Ku Klux Klan, the White League, and the Knights of the White Camelia, which aimed to assassinate or intimidate African Americans and the white officials who tried to help them assimilate after they were freed from slavery. Such groups also used violence to prevent black people from voting. Violence was the way some tried to restore a system of white supremacy that had been disrupted by the end of slavery.
Coinciding with this increased violence was an increase in cases of a nutrition-based disease called pellagra. And while it may seem peculiar in this context to mention pellagra, this now rare illness played a significant role in the postemancipation era. Pellagra is caused by niacin deficiency and was epidemic in the South. It could cause a form of dementia that made some people chronically depressed and angry and others impulsively violent.
Science ultimately traced the cause of pellagra to the Southern diet at that time. But because of sheer indifference and ignorance, local doctors chose to overlook the obvious connection between unhealthy diet, pellagra, impaired brain function, and violent behavior. Of course, this nutritional deficiency was certainly not the sole cause of violence and hate crimes, but there is no question now that there is a link between the foods people ate and the behaviors that ensued. And the severe and dangerous problems exacerbated by poor nutrition continued needlessly for decades. All the while, a vital lesson about the importance of nutrition was ignored.
Despite being a disease most Americans have never heard of, pellagra created a famous American stereotype. It had four distinct symptoms known as the four Ds: dermatitis, diarrhea, dementia, and death.2 The dermatitis caused areas of the skin exposed to the sun to turn a bright red; one of the etymologies of the term “redneck” is that it referred to poor white people with this condition. Europeans called the rash a Casal necklace, after Gaspar Casal, who first described the disease in 1762. “Pellagra” comes from the Italian phrase pelle agra, meaning sour skin.