[2017] Lore of Nutrition: Challenging Conventional Dietary Beliefs


by Tim Noakes

[Table continued from the previous page. The rows cut off here compared the stomach, colon, caecum, rectal bacteria and faeces of the three groups: putrefactive rectal bacteria and small, firm faeces with rare food residue in the carnivore and in man, versus fermentative bacteria, a long, capacious colon and caecum serving a vital function, and voluminous faeces containing a large amount of food residue in the herbivore.]

                            CARNIVORE         MAN               HERBIVORE

  GALL BLADDER
  Size                      well developed    well developed    often absent
  Function                  strong            strong            weak or absent

  DIGESTIVE ACTIVITY
  From pancreas             solely            solely            partial
  From bacteria             none              none              partial
  From protozoa             none              none              partial
  Digestive efficiency      100%              100%              50% or less

  FEEDING HABITS
  Frequency                 intermittent      intermittent      continuous

  SURVIVAL WITHOUT
  Stomach                   possible          possible          impossible
  Colon and caecum          possible          possible          impossible
  Microorganisms            possible          possible          impossible
  Plant foods               possible          possible          impossible
  Animal protein            impossible        impossible        possible

  RATIO OF BODY LENGTH TO
  Entire digestive tract    1:5               1:7               1:27
  Small intestine           1:4               1:6               1:25

  Reproduced from W.L. Voegtlin, The Stone Age Diet, New York: Vantage Press, 1975, pp. 44–45

  The benefit of hunting large (but more dangerous) mammals like elephant, rhinoceros, hippopotamus, buffalo and eland is then more easily understood. Miki Ben-Dor has calculated that the body of an eland – the revered prey of the !Kung San – provides the same amount of fat as 24 impalas, even though the body weight of an eland is only 10 times greater than that of an impala. A single hippopotamus provides the fat equivalent of 75 impalas.15
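  Ben-Dor's comparison can be made concrete with a little arithmetic. The sketch below uses only the ratios quoted above (an eland weighing roughly 10 times an impala but yielding the fat of 24; a hippopotamus yielding the fat of 75); the unit masses are illustrative placeholders, not source data.

```python
# Worked arithmetic for Ben-Dor's prey-fat comparison. Only the ratios
# quoted in the text are used; absolute units are arbitrary placeholders.

impala_mass = 1.0              # one impala body mass (arbitrary unit)
impala_fat = 1.0               # fat yield of one impala (arbitrary unit)

eland_mass = 10 * impala_mass  # eland weighs ~10x an impala (quoted)
eland_fat = 24 * impala_fat    # but yields the fat of 24 impalas (quoted)
hippo_fat = 75 * impala_fat    # one hippo yields the fat of 75 impalas (quoted)

# Fat yield per unit of body mass: the eland carries ~2.4x more fat per
# kilogram of prey than the impala, so one dangerous kill beats many safe ones.
fat_density_ratio = (eland_fat / eland_mass) / (impala_fat / impala_mass)
print(fat_density_ratio)       # 2.4

# A single hippo delivers the fat of roughly three elands.
print(hippo_fat / eland_fat)   # 3.125
```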

  My conclusion is that humans evolved as obligate fat-eaters, and our biology is dependent on eating diets high in fat and moderate in protein, with carbohydrates providing only that balance of calories that cannot be obtained from readily available fat and protein sources.

  All this evidence demonstrates that fish and animal produce are essential to optimise human brain development. Only fish and animal produce contain the necessary brain-specific nutrients16 in high concentrations; these nutrients are not present in appropriate concentrations in cereals and grains. That is why meat and fish – and not cereals and grains, however much they might be ‘fortified’ – are the only suitable complementary foods.

  This raises the next key question: If humans are big-brained mammals, when in our life cycle do our big brains develop?

  Logically, one might think it occurs in the womb, i.e. before birth. But if this were the case, the diameter of the female pelvis would have had to increase proportionally to allow the passage of a large-brained foetus. And wide-hipped humans would have been less efficient runners, preventing effective persistence hunting in midday heat, another key to our evolution as a species.17 So a better solution had to be found. And this was to concentrate the period of greatest growth of the human brain within the first 1 000 days of life.

  But there was still one other critical problem that had to be overcome.

  The human brain comprises 60 per cent fat, over 25 per cent of which is cholesterol. The blood–brain barrier, which blocks large molecules – and pathogens such as bacteria and viruses – from passing directly from the bloodstream to the brain, also means that any large molecules needed to construct those fats cannot reach the brain directly. The solution that human evolution conferred was to use ketone bodies for this purpose.

  Ketone bodies are produced by the liver whenever blood insulin concentrations are low, and fat, not carbohydrate, is being used as the principal energy fuel. Ketones are small, water-soluble molecules that can cross the blood–brain barrier and be used to build the complex fat molecules (like cholesterol) that comprise a large proportion of the human brain. In this way, ‘Humans co-opted a trait that was previously an adaptation to cope with periods of starvation, into our default metabolism to support brain growth in particular, but also to meet the brain’s ongoing energy requirements.’18

  Figure 16.3

  Change in brain size during the first 10 years of life in chimpanzees (left panel) and humans (right panel). Note that most of the increase in brain size (cranial capacity) in humans occurs during the first four years after birth, but especially during the first two years of life. Crucial to my HPCSA trial was my argument that, during this period, infants need to be weaned onto foods that are high in brain-specific nutrients – namely, animal- and fish-based foods. Reproduced from Amber O’Hearn, ‘Optimal weaning from an evolutionary perspective’19

  Blood-borne ketone bodies are thus essential for human survival and brain growth, a fact that none of the HPCSA’s expert witnesses seemed to understand. Instead, all expressed the false belief that ketones are harmful because of their association with diabetic ketoacidosis, a condition with which each ‘expert’ seemed to have a shallow acquaintance. It is clear that doctors and dietitians in South Africa are not taught that ketosis is healthy and the natural state of the newborn, and crucial for the full development of the large human brain.

  It follows that during the first 1 000 days of life, the infant must be fed a diet that is rich in brain-essential nutrients. In addition, the diet must ensure that sufficient ketone bodies are produced to fuel this period of rapid brain growth. As I argued in both Raising Superheroes and the HPCSA trial, brain-essential nutrients are not found in maize and other cereals onto which South Africans are encouraged to wean their babies. In Raising Superheroes I also presented evidence that the human newborn has a high capacity to both produce and metabolise ketone bodies, perhaps even better than fat-adapted adults.

  For Claire Julsing Strydom to suggest that ketosis is dangerous for newborns and might even be ‘life-threatening’, and for the HPCSA then to prosecute me on that basis, exposes a very worrying level of ignorance in the South African medical and dietetics professions.

  In summary, we can conclude that two million years ago our hominin ancestors had already mastered the art of killing and eating large mammals on the African savannah, with little more than stone implements. These skills were essential for the conversion of humans from herbivory to carnivory. In the process, they evolved from small- to large-brained mammals capable of prodigious mental achievements.

  A foregut fermenter that has adapted to carnivory; a carnivore that has failed to adapt to herbivory

  Two mammals show that it is easier for the mammalian intestine to convert from herbivory to carnivory than to do the reverse. Dolphins are sea-living, fish-eating carnivores whose closest relatives are thought to be land mammals, specifically the hippopotamuses and other even-toed hoofed mammals, including camels, pigs and giraffes, from whom they diverged about 50 million years ago when some enterprising dolphin ancestor decided she would prefer to live in the sea. While modern dolphins retain anatomical evidence of a four-chambered (ruminant-like) stomach, it no longer operates as a fermenting vat filled with anaerobic bacteria. Instead, the dolphins' herbivore-like stomach now works exactly as do the stomachs of other carnivores such as wolves, lions and humans.

  Contrast this with the experience of the panda bear. Bears have the digestive tract of carnivores, although most bears (other than polar bears) are omnivores. Lacking a long large bowel specifically designed for fermentation, the omnivorous bears compensate by eating more plant material and supplementing with meat and fish when available. For the past two million years or so, however, panda bears have lived in an environment in which bamboo makes up 99 per cent of their diet. Yet in that time, the panda has not adapted in any way to herbivory.

  In particular, the gut of the panda bear does not contain the cellulose-digesting bacteria present in abundance in the intestines of fermenting herbivores. Instead, ‘the giant panda appears not to have evolved a gut microbiota compatible with its newly adopted diet, which may adversely influence the co-evolutionary fitness of this herbivore’.20 So ‘the giant panda still retains a gastrointestinal tract typical of carnivores. The animals also do not have the genes for plant-digesting enzymes in their own genome. This combined scenario may have increased their risk for extinction.’21 Scientists speculate that it was the unfortunate loss of one of the genes driving meat-eating – the umami taste receptor – that may have converted pandas to herbivory.

  As a result of this failure to adapt to herbivory, pandas must spend up to 14 hours a day eating up to 13 kilograms of leaves and stems of which they digest only about 17 per cent. Because so little of this material is digested, pandas must pass voluminous faeces – as many as 40 times per day. In addition, they are prone to develop irritable bowel syndrome: ‘They get stomach cramps, go off their food and lie in a heap for a few days.’22
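  The panda's daily bamboo budget can be sketched from the figures just quoted; treat this as a rough illustration rather than a measured energy budget.

```python
# Rough daily bamboo budget for a panda, using only the figures quoted
# above: up to 13 kg eaten over up to 14 hours, with ~17% digested.

bamboo_eaten_kg = 13.0
digestive_efficiency = 0.17
hours_feeding = 14.0

digested_kg = bamboo_eaten_kg * digestive_efficiency
residue_kg = bamboo_eaten_kg - digested_kg

print(f"digested: {digested_kg:.1f} kg/day")   # ~2.2 kg actually absorbed
print(f"residue:  {residue_kg:.1f} kg/day")    # ~10.8 kg passed as faeces
print(f"yield:    {digested_kg / hours_feeding:.2f} kg per feeding hour")  # ~0.16
```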

  Their nutritionally poor diet explains why panda bears reproduce so poorly – female pandas ovulate only once a year and are fertile for about three days; why they limit their social interactions; why they must spend up to 12 hours a day sleeping; and why they avoid walking up steeply sloping terrain. Their somnolence gives ‘the impression of an animal that eats purely to have enough energy to carry on eating’.23

  This is the natural consequence of eating a nutrient-poor, exclusively plant-based diet.

  By leaving behind the security of the forests and jungles, our hominin ancestors embarked on the journey that would ultimately lead to us, Homo sapiens.

  When humans began their conversion to carnivory two to three million years ago, they no longer needed bacteria to enhance the quality of the nutrient-poor, plant-based diet they had eaten in the past. Perhaps our ancestors realised that rather than eating a nutrient-poor diet and having to eat and defecate all day, it would be much more efficient to leave that to the experts – cellulose-fermenting herbivores – and then simply catch and eat those experts.

  The change to a nutrient-dense diet removed the need for a long and voluminous fermenting large bowel, producing humans with a much-reduced abdominal cavity. As a result, our hips became narrower, allowing more efficient running (as our knees were closer together). With time and additional adaptations, as described in my book Waterlogged,24 humans became the perfect two-legged running mammal, able to chase even four-legged ruminants to their exhaustion in midday heat. In time, the development of highly mobile shoulders allowed humans to become adept throwers of spears and other stone implements, so that running in the heat became a less important method of hunting. The end result was that the more nutrient-dense foods our human ancestors ate, the larger our energy-demanding brains became.

  I argue that these anatomical changes – but most especially our conversion from hindgut fermenting hominins to carnivorous Homo sapiens dependent only on a functioning small bowel for the digestion and absorption of our nutrient-dense, predominantly animal-based diet – adequately explain why the obesity/T2DM epidemic was bound to happen when modern humans switched to a diet that derives most of its energy from carbohydrates.

  And that is exactly what happened after 1977 when we were told that, to protect ourselves from heart attack, we had to replace most of the fat in our diet with carbohydrates.

  Unfortunately, those who advised us to undertake this disastrous experiment failed to understand that the carnivore’s digestive tract is simply not designed to cope with a high-carbohydrate diet. And it was not as if we had not already been warned by the first disastrous human dietary experiment, courtesy of the agricultural revolution.

  The agricultural revolution

  Beginning about 12 000 years ago, humans living in the Fertile Crescent – the arc of land stretching from Western Asia to the Nile Valley, including the areas surrounding the Tigris, Euphrates and Nile rivers – began to domesticate animals and cultivate grains for the first time. The addition of cereals and grains to the human diet reduced our reliance on hunting, fishing and gathering. A secure source of storable, year-round food allowed the growth of stable communities living together in towns and villages.

  Jared Diamond suggests that, for the future of the earth and the human species, agriculture was ‘the worst mistake in the history of the human race’.25 According to him, the adoption of farming produced a number of serious disadvantages, including starvation, epidemic diseases and malnutrition. By 3 000 BC, humans living on cereals and grains had lost at least five inches in height; some peoples, the Greeks and Turks in particular, have still to regain the average heights of their pre-agricultural relatives. But perhaps worst of all, agriculture produced deep class divisions, as it allowed some to accumulate wealth by storing food.

  Figure 16.4

  The spread of agriculture since 8 000 BC. Dates indicate achievement of food production by some people in the region. Arrows show movement of imported domesticated food products. Note that wheat and barley were first domesticated in the Fertile Crescent, whereas maize, beans, potatoes and squashes come from Central and South America. Rice, millet, soybeans, sorghum and hemp are from North China. Redrawn from ‘Agriculture slowly spreads’26

  Similarly, in his book Sapiens, Yuval Noah Harari devotes an entire chapter to the agricultural revolution, which he calls ‘history’s biggest fraud’. He argues that wheat domesticated humans, not the reverse (to wheat’s, not humans’, advantage), so that: ‘This is the essence of the Agricultural Revolution: the ability to keep more people alive under worse conditions.’27

  The precise reason why humans turned to cereals and grains as a food source after millions of years of hunting is contested, and at least three separate theories have been advanced. The first is that human hunters were so successful that they exterminated the once-plentiful animals on which they lived, especially those with the highest body-fat contents.28 Thus they had to find an alternative source of energy to compensate for the ‘fat gap’ in their daily energy intakes. The second is that humans fell for some or other addictive chemicals present in grains. This is compatible with Harari’s idea that grains domesticated humans, rather than the other way around. The third theory is that cereals and grains were grown to entice wild animals so that they could be more easily captured.

  With time, the cultivation of cereals spread east and west (see Figure 16.4), in part because cultivation destroyed the soil of the Fertile Crescent so that new, still-fertile lands had to be found.

  Figure 16.5

  The last ice age lasted from 114 000 to about 10 000 years ago, during which much of North America, Europe, including most of the British Isles, and Asia were under sheets of ice up to a mile thick. Redrawn from A. Watts, ‘90% of the last million years, the normal state of the Earth’s climate has been an ice age’29

  So for people like myself, with Western European ancestry, the first exposure of our ancestors to dietary cereals and grains can only have occurred within the last 10 000 years. Until then, our (Western) European ancestors most likely lived mainly on the woolly mammoths, whose distribution matched almost perfectly the distribution of the ice cap,30 but which were hunted to extinction as recently as 4 000 years ago.

  As described in The Real Meal Revolution, the Egyptians were one of the first to adopt wheat as a dietary staple, to the extent that they were nicknamed ‘artophagoi’, meaning eaters of bread. Comprising primarily carbohydrates (bread, fruits, vegetables, honey), oils (olive, flaxseed, safflower, sesame), goat’s milk and cheese, fish, waterfowl and occasional red meat, their diet was an almost perfect example of the kind – low in saturated fat and cholesterol – that would be prescribed as ideal in the 1977 US dietary guidelines.

  As Dr Michael Eades points out, if such is the ultimately healthy diet, then the ‘ancient Egyptians should have lived forever or at least should have lived long, healthy lives and died of old age in their beds. But did they?’31

  Eades provides evidence to answer his own question: ‘So, a picture begins to emerge of an Egyptian populace, rife with disabling dental problems, fat bellies and crippling heart disease … sounds a lot like the afflictions of millions of people in America today, doesn’t it? The Egyptians didn’t eat much fat, had no refined carbohydrates … and ate almost nothing but whole grains, fresh fruits and vegetables, and fish and fowl, yet were beset with all the same diseases that afflict modern man. Modern man, who is exhorted to eat lots of whole grains, fresh fruits and vegetables, to prevent or reverse these diseases.’ Eades concludes that this historical evidence might suggest that ‘there are some real problems with the low-fat, high carbohydrate diet’.33

  Figure 16.6

  Fossil evidence showing a reduction in human skull size in the past 10 000 years. Redrawn from M. Henneberg, ‘Decrease of human skull size in the Holocene’32

  Interestingly, this grain-based diet did not prevent the development of arterial disease. In a treatise published in 1911, Sir Marc Armand Ruffer wrote: ‘I cannot therefore at present give any reason why arterial disease should have been so prevalent in ancient Egypt. I think, however, that it is interesting to find that it was common, and that three thousand years ago it represented the same anatomical characteristics as it does now.’34

  Another cost of adopting a cereal-based diet may be that human brain size has decreased progressively in the past 10 000 years. The decline began at precisely the moment when humans began to raise their children on cereals and grains and less animal produce. This possibility should be of interest to those involved in my HPCSA trial, as should the story of how maize, a nutrient-poor food that does not contain all the brain-specific nutrients, became the staple ‘complementary food’ for infant weaning in South Africa.

 
