Penny le Couteur & Jay Burreson
A similar set of reactions, in a somewhat different order, is the basis for the modern synthetic method (also from glucose) for the industrial preparation of ascorbic acid. The first step is an oxidation reaction, meaning that oxygen is added to a molecule, or hydrogen is removed, or possibly both. In the reverse process, known as reduction, either oxygen is removed from a molecule, or hydrogen is added, or again possibly both.
The second step involves reduction at the opposite end of the glucose molecule from that of the first reaction, forming a compound known as gulonic acid. In the third step, gulonic acid forms a cyclic, or ring, molecule in the form of a lactone. A final oxidation step then produces the double bond of the ascorbic acid molecule. It is the enzyme for this fourth and last step that humans are missing.
The initial attempts to isolate and identify the chemical structure of vitamin C were unsuccessful. One of the major problems is that although ascorbic acid is present in reasonable amounts in citrus juices, separating it from the many other sugars and sugarlike substances that are also present in these juices is very difficult. It’s not surprising, therefore, that the isolation of the first pure sample of ascorbic acid was not from plants but from an animal source.
In 1928, Albert Szent-Györgyi, a Hungarian doctor and biochemist working at Cambridge University in England, extracted less than a gram of crystalline material from bovine adrenal cortex, the outer part of a pair of endocrine glands situated atop a cow’s kidneys. Present at only about 0.03 percent by weight in his source, the compound was not initially recognized as vitamin C. Szent-Györgyi thought he had isolated a new sugarlike hormone and suggested the name ignose, the ose part being the ending used for names of sugars (like glucose and fructose) and the ig part signifying that he was ignorant of the substance’s structure. When Szent-Györgyi’s second suggestion for a name, God-nose, was also rejected by the editor of the Biochemical Journal (who obviously did not share his sense of humor), he settled for the more sedate name hexuronic acid. Szent-Györgyi’s sample had been pure enough for accurate chemical analysis to show six carbon atoms in the formula, C6H8O6, hence the hex of hexuronic acid. Four years later it was shown that hexuronic acid and vitamin C were, as Szent-Györgyi had come to suspect, one and the same.
The next step in understanding ascorbic acid was to determine its structure, a task that today’s technology could accomplish relatively easily using very small amounts but was nearly impossible in the absence of large quantities in the 1930s. Once again Szent-Györgyi was in luck. He discovered that Hungarian paprika was particularly rich in vitamin C and, more important, was particularly lacking in other sugars that had made the compound’s isolation in fruit juice such a problem. After only one week’s work he had separated over a kilogram of pure vitamin C crystals, more than sufficient for his collaborator, Norman Haworth, professor of chemistry at the University of Birmingham, to begin the successful determination of the structure of what Szent-Györgyi and Haworth had now termed ascorbic acid. In 1937 the importance of this molecule was recognized by the scientific community. Szent-Györgyi was awarded the Nobel Prize for medicine for his work on vitamin C, and Haworth the Nobel Prize for chemistry.
Despite more than sixty years of further work, we are still not completely sure of all the roles that ascorbic acid plays in the body. It is vital for the production of collagen, the most abundant protein in the animal kingdom, found in connective tissues that bind and support other tissues. Lack of collagen, of course, explains some of the early symptoms of scurvy: the swelling of limbs, softening of gums, and loosening of teeth. As little as ten milligrams a day of ascorbic acid is said to be sufficient to keep the symptoms of scurvy at bay, although at that level sub-clinical scurvy (vitamin C deficiency at the cellular level but no gross symptoms) probably exists. Research in areas as varied as immunology, oncology, neurology, endocrinology, and nutrition is still discovering the involvement of ascorbic acid in many biochemical pathways.
Controversy as well as mystery has long surrounded this small molecule. The British navy delayed implementing James Lind’s recommendations by a scandalous forty-two years. The East India Company purportedly withheld antiscorbutic foods on purpose in order to keep its sailors weak and controllable. At present there are debates on whether megadoses of vitamin C play a role in treatment of a variety of conditions. Linus Pauling was recognized in 1954 with the Nobel Prize in chemistry for his work on the chemical bond and again in 1962 with the Nobel Peace Prize for his activities opposing the testing of nuclear weapons. In 1970 this double Nobel laureate released the first of a number of publications on the role of vitamin C in medicine, recommending high doses of ascorbic acid for the prevention and cure of colds, flu, and cancer. Despite his eminence as a scientist, the medical establishment has not generally accepted Pauling’s views.
The RDA (Recommended Dietary Allowance) of vitamin C for an adult is generally given as sixty milligrams per day, about the amount found in a small orange. The RDA has varied over time and from country to country, perhaps indicating our lack of understanding of the complete physiological role of this not-so-simple molecule. It is agreed that a higher RDA is necessary during pregnancy and breast-feeding. The highest RDA is recommended for older people, whose vitamin C intake is often reduced through poor diet or lack of interest in cooking and eating. Scurvy today is not unknown among the elderly.
A daily dose of 150 milligrams of ascorbic acid generally corresponds to a saturation level, and further intake does little to increase the ascorbic acid content of blood plasma. As excess vitamin C is eliminated through the kidneys, it has been claimed that the only good done by megadoses is to provide profits for pharmaceutical companies. It does seem, however, that higher doses may be necessary under circumstances such as infection, fever, wound recovery, diarrhea, and a long list of chronic conditions.
Research continues into the role of vitamin C in more than forty disease states: bursitis, gout, Crohn’s disease, multiple sclerosis, gastric ulcers, obesity, osteoarthritis, Herpes simplex infections, Parkinson’s, anemia, coronary heart disease, autoimmune diseases, miscarriages, rheumatic fever, cataracts, diabetes, alcoholism, schizophrenia, depression, Alzheimer’s, infertility, colds, flu, and cancer, to name just some of them. When you look at this list, you may see why this molecule has sometimes been described as “youth in a bottle,” although research results do not as yet support all the miracles that have been claimed.
Over fifty thousand tons of ascorbic acid are manufactured annually. Produced industrially from glucose, synthetic vitamin C is absolutely identical in every way to its natural counterpart. There is no physical or chemical difference between natural and synthetic ascorbic acid, so there is no reason to buy an expensive version marketed as “natural vitamin C, gently extracted from the pure rose hips of the rare Rosa macrophylla, grown on the pristine slopes of the lower Himalayas.” Even if the product did originate at this source, if it is vitamin C, it is exactly the same as vitamin C that has been manufactured by the ton from glucose.
This is not to say that manufactured vitamin pills can replace the natural vitamins in foods. Swallowing a seventy-milligram ascorbic acid pill may not produce quite the same benefits as the seventy milligrams of vitamin C obtained from eating an average-sized orange. Other substances found in fruits and vegetables, such as those responsible for their bright colors, may help the absorption of vitamin C or in some way, as yet unknown, enhance its effect.
The main commercial use of vitamin C today is as a food preservative, where it acts as an antioxidant and an antimicrobial agent. In recent years food preservatives have come to be seen as bad. “Preservative free” is shouted from many a food package. Yet without preservatives much of our food supply would taste bad, smell bad, be inedible, or even kill us. The loss of chemical preservatives would be as great a disaster to our food supply as would the cessation of refrigeration and freezing.
It is possible to safely preserve fruit in the canning process at the temperature of boiling water, as fruit is usually acidic enough to prevent the growth of the deadly microbe Clostridium botulinum. Lower-acid vegetables and meats must be processed at higher temperatures to kill this common microorganism. Ascorbic acid is often used in home canning of fruit as an antioxidant to prevent browning. It also increases acidity and protects against botulism, the name given to the food poisoning resulting from the toxin produced by the microbe. Clostridium botulinum does not survive inside the human body; it is the toxin it produces in improperly canned food that is dangerous, although only if eaten. Tiny amounts of the purified toxin injected under the skin interrupt nerve impulses and induce muscle paralysis. The result is a temporary erasing of wrinkles—the increasingly popular Botox treatment.
Although chemists have synthesized many toxic chemicals, nature has created the most deadly. Botulinum toxin A, produced by Clostridium botulinum, is the most lethal poison known, one million times more deadly than dioxin, the most lethal man-made poison. For botulinum toxin A, the lethal dose that will kill 50 percent of a test population (the LD50) is 3 × 10⁻⁸ mg per kg. A mere 0.00000003 milligrams of botulinum toxin A per kilogram of body weight of the subject is lethal. For dioxin, the LD50 is 3 × 10⁻² mg per kg, or 0.03 milligrams per kilogram of body weight. It has been estimated that one ounce of botulinum toxin A could kill 100 million people. These numbers should surely make us rethink our attitudes toward the perceived evils of preservatives.
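The millionfold comparison follows directly from the two LD50 figures quoted above; a quick back-of-the-envelope check (written here in Python, purely for illustration) confirms the ratio:

```python
# Illustrative check of the lethality ratio from the two LD50 values in the text.
# Units: mg of toxin per kg of body weight.
botulinum_ld50 = 3e-8  # botulinum toxin A: 0.00000003 mg/kg
dioxin_ld50 = 3e-2     # dioxin: 0.03 mg/kg

# How many times more toxin is needed for a lethal dose of dioxin
# than of botulinum toxin A:
ratio = dioxin_ld50 / botulinum_ld50
print(f"Botulinum toxin A is about {ratio:,.0f} times more lethal than dioxin")
```

The exponents alone tell the story: 10⁻² divided by 10⁻⁸ is 10⁶, the "one million times" of the text.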
SCURVY ON ICE
Even in the early twentieth century a few Antarctic explorers still supported theories that putrefaction of preserved food, acid intoxication of the blood, and bacterial infections were the cause of scurvy. Despite the fact that compulsory lemon juice had virtually eliminated scurvy from the British navy in the early 1800s, despite observations that Eskimos in the polar regions who ate the vitamin C-rich fresh meat, brain, heart, and kidneys of seals never suffered from scurvy, and despite the experience of numerous explorers whose antiscorbutic precautions included as much fresh food as possible in the diet, the British naval commander Robert Falcon Scott persisted in his belief that scurvy was caused by tainted meat. The Norwegian explorer Roald Amundsen, on the other hand, took the threat of scurvy seriously and based the diet for his successful South Pole expedition on fresh seal and dog meat. His 1911 round trip to the pole, some fourteen hundred miles, was accomplished without sickness or accident. Scott’s men were not so fortunate. Their return journey, after reaching the South Pole in January 1912, was slowed by what is now thought to be the Antarctic’s worst weather in years. Symptoms of scurvy, brought on by several months on a diet devoid of fresh food and vitamin C, may have greatly hampered their efforts. Only eleven miles from a food and fuel depot they found themselves too exhausted to continue. For Commander Scott and his companions, just a few milligrams of ascorbic acid might have changed their world.
Had the value of ascorbic acid been recognized earlier, the world today might be a very different place. With a healthy crew Magellan might not have bothered to stop in the Philippines. He could have gone on to corner the Spice Islands clove market for Spain, sail triumphantly upriver to Seville, and enjoy the honors due to the first circumnavigator of the globe. A Spanish monopoly of the clove and nutmeg markets might have thwarted the establishment of the Dutch East India Company—and changed modern-day Indonesia. If the Portuguese, the first European explorers to venture these long distances, had understood the secret of ascorbic acid, they might have explored the Pacific Ocean centuries before James Cook. Portuguese might now be the language spoken in Fiji and Hawaii, which might have joined Brazil as colonies in a far-flung Portuguese Empire. Maybe the great Dutch navigator Abel Janszoon Tasman, with the knowledge of how to prevent scurvy on his voyages of 1642 and 1644, would have landed on and formally laid claim to the lands known as New Holland (Australia) and Staten Land (New Zealand). The British, coming later to the South Pacific, would have been left with a much smaller empire and far less influence in the world, even to this day. Such speculation leads us to conclude that ascorbic acid deserves a prominent place in the history—and geography—of the world.
3. GLUCOSE
THE NURSERY RHYME phrase “Sugar and spice and everything nice” pairs sugar with spices—a classic culinary matching that we appreciate in such treats as apple pie and gingerbread cookies. Like spice, sugar was once a luxury affordable only by the rich, used as a flavoring in sauces for meat and fish dishes that today we would consider savory rather than sweet. And like spice molecules, the sugar molecule affected the destiny of countries and continents as it ushered in the Industrial Revolution, changing commerce and cultures around the world.
Glucose is a major component of sucrose, the substance we mean when we refer to sugar. Sugar has names specific to its source, such as cane sugar, beet sugar, and corn sugar. It also comes in a number of variations: brown sugar, white sugar, berry sugar, castor sugar, raw sugar, demerara sugar. The glucose molecule, present in all these kinds of sugar, is fairly small. It has just six carbon, six oxygen, and twelve hydrogen atoms, altogether the same number of atoms as in the molecules responsible for the tastes of nutmeg and cloves. But just as in those spice molecules, it is the spatial arrangements of the atoms of the glucose molecule (and other sugars) that result in a taste—a sweet taste.
Sugar can be extracted from many plants; in tropical regions it is usually obtained from sugarcane and in temperate regions from sugar beets. Sugarcane (Saccharum officinarum) is variously described as originating in the South Pacific or southern India. Sugarcane cultivation spread through Asia and to the Middle East, eventually reaching northern Africa and Spain. Crystalline sugar extracted from cane reached Europe with the first of the returning Crusaders during the thirteenth century. For the next three centuries it remained an exotic commodity, treated in much the same way as spices: the center for the sugar trade developed initially in Venice along with the burgeoning spice trade. Sugar was used in medicine to disguise the often-nauseating taste of other ingredients, to act as a binding agent for drugs, and as a medicine in itself.
By the fifteenth century sugar was more readily available in Europe, but it was still expensive. An increase in the demand for sugar and lower prices coincided with a decrease in the supply of honey, which had previously been the sweetening agent in Europe and much of the rest of the world. By the sixteenth century sugar was rapidly becoming the sweetener of choice for the masses. It became even more popular with the seventeenth- and eighteenth-century discoveries of the preservation of fruit by sugar and the making of jams, jellies, and marmalades. In England in 1700 the estimated yearly per capita consumption of sugar was about four pounds. By 1780 this had risen to twelve pounds and, in the 1790s, to sixteen pounds, much of it probably consumed in the newly popular drinks of tea, coffee, and chocolate. Sugar was also being used in sweet treats: syrup-covered nuts and seeds, marzipan, cakes, and candies. It had become a staple food, a necessity rather than a luxury, and consumption continued to rise through the twentieth century.
Between 1900 and 1964 world sugar production increased by 700 percent, and many developed countries reached a per capita annual consumption of one hundred pounds. This figure has dropped somewhat in recent years with the increasing use of artificial sweeteners and concerns over high-calorie diets.
SLAVERY AND SUGAR CULTIVATION
Without the demand for sugar, our world today would probably be a lot different. For it was sugar that fueled the slave trade, bringing millions of black Africans to the New World, and it was profit from the sugar trade that by the beginning of the eighteenth century helped spur economic growth in Europe. Early explorers of the New World brought back reports of tropical lands that were ideal for the cultivation of sugar. It took little time for Europeans, eager to overcome the sugar monopoly of the Middle East, to start growing sugar in Brazil and then in the West Indies. Sugarcane cultivation is labor intensive, and two possible sources of workers—native populations of the New World (already decimated by newly introduced diseases such as smallpox, measles, and malaria) and indentured servants from Europe—could not supply even a fraction of the needed workforce. The colonists of the New World looked toward Africa.
Until this time the slave trade from western Africa was mainly limited to the domestic markets of Portugal and Spain, an outgrowth of the trans-Saharan trade of the Moorish people around the Mediterranean. But the need for workers in the New World drastically increased what had been to that point a minor practice. The prospect of deriving great wealth from sugar cultivation was enough for England, France, Holland, Prussia, Denmark, and Sweden (and eventually Brazil and the United States) to become part of a massive system of transporting millions of Africans from their homes. Sugar was not the only commodity that relied on slave labor, but it was probably the major one. According to some estimates, around two-thirds of African slaves in the New World labored on sugar plantations.
The first slave-grown sugar from the West Indies was shipped to Europe in 1515, just twenty-two years after Christopher Columbus had, on his second voyage, introduced sugarcane to the island of Hispaniola. By the middle of the sixteenth century Spanish and Portuguese settlements in Brazil, Mexico, and many Caribbean islands were producing sugar. The annual slave shipment from Africa to these plantations numbered around ten thousand. Then in the seventeenth century the English, French, and Dutch colonies in the West Indies began growing sugarcane. The rapidly expanding demand for sugar, the growing technology of sugar processing, and the development of a new alcoholic drink, rum, from the by-products of sugar refining contributed to an explosive rise in the number of people dispatched from Africa to work the sugarcane fields.