Zapped


by Bob Berman


  Thanks to modern architecture, it was easy for people in the 1960s to do just that. Increasingly, buildings were constructed with windows that could not be opened—and glass effectively blocks the sun’s invisible ultraviolet emissions. When out of the office, people rode in cars, where the availability of air-conditioning starting in the early 1960s encouraged closed-window driving. Then in the 1980s a new product appeared on the market: sunblock, which essentially stops the body’s vitamin D production cold. In their early years, such lotions—with numbers like SPF 30 and SPF 45 and names like Coppertone, a product that was introduced in the early 1950s—had been primarily marketed as tan-promoting products. But by the late twentieth century, sunblock was sold as a skin-cancer preventive, and people were advised to cover themselves with it throughout the summer months. Even the medical establishment urged hiding from the sun as a way to avoid skin cancer.

  At the same time, a series of major developments altered children’s exposure to ultraviolet rays as well as to visible solar rays. Until the 1970s, kids were encouraged to play outdoors after school. They’d hit the parks, ball fields, and playgrounds and materialize at home just before dinnertime. All that changed with, first, the widespread fear of crime, especially sexual predation. Next came the video-game craze in the 1980s, which meant that kids were opting to shut themselves in their rooms rather than climb trees. In the 2000s, texting and Internet surfing came along to increase indoor time for kids even more. By the early years of the twenty-first century, the metamorphosis was complete. Humans in Western countries had transformed themselves into a race of mole people. Avoidance of the sun was almost total. At the same time, levels of vitamin D in blood plunged to virtually zero. Nowadays, many vitamin D researchers and expert groups say that a blood level of at least 30 ng/mL (nanograms per milliliter) is optimal; some advise even higher goals—40 or 50 ng/mL.

  In the midst of the great indoor migration, we apparently forgot something humans have known for millennia. The ancient Greeks, who worshipped the sun god Helios, seem to have been the first to write about the importance of sunlight in human health. They of course didn’t know that it’s the UV component of sunlight that produces its most salutary effects, but nonetheless the benefits of sunbathing appeared in the writings of Herodotus (fifth century BCE), Cicero (first century BCE), the architect Vitruvius (first century BCE), Pliny the Elder (23–79 CE), the famed Roman physician Galen (130–200 CE), the Greek surgeon Antyllus (second century CE), and others.

  After the fall of the Roman Empire, the practice of sunbathing apparently fell into disfavor. But it emerged again during the early Middle Ages, as documented by the Persian scholar and physician Avicenna (980–1037 CE). Sunbathing for medical and cosmetic purposes has continued ever since. Cultures around the globe maintain a belief in the healing power of sunlight. Or at least we did until the 1980s, when skin cancer worries abruptly changed the picture.

  It may be true that excessive sun exposure increases risk of skin cancer. But paradoxically, it also seems that sunlight can lower our cancer risk. The late Dr. Robert Heaney of Creighton University, in Omaha, who treated thousands of patients with vitamin D and was a member of the nonprofit Vitamin D Council, pointed out that vitamin D was found to be beneficial in thirty-two randomized trials. In one big study of women whose average age was sixty-two, a large daily vitamin D supplement produced a whopping 60 percent reduction in all kinds of cancers after just four years, compared to a control group.

  Heaney was not alone in believing that vitamin D prevents tiny, predetectable cancers from growing and spreading. “That’s the kind of cancer I’d want to have—one that never grows,” he told me.

  In any discussion of the natural rays, visible and invisible, in which our bodies evolved, obtaining sufficient UV-provoked vitamin D3 makes good medical sense. (There are five chemically distinct types of vitamin D. The one the body creates when struck by ultraviolet light is also the kind that, when taken as a supplement, is most associated with reduced mortality, especially in the elderly. This is vitamin D3.) After all, vitamin D is automatically created in our bodies when our skin is struck by the sun’s ultraviolet rays. Why would our bodies rapidly create it if it weren’t important to have in our blood? Ten minutes in strong sunlight stimulates your body to create as much vitamin D as you’d get from drinking two hundred glasses of milk. The natural conclusion we can draw from this is that the human body needs to have a high and steady level of this vitamin in circulation. So if we hide from the sun, we’ll suffer a deficiency.
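  A rough back-of-the-envelope check (the figures here are my illustrative assumptions, not the author’s): fortified milk in the United States supplies on the order of 100 IU of vitamin D per glass, and brief full-body exposure to strong summer sun is commonly estimated to generate 10,000 to 20,000 IU, so the comparison is at least plausible in order of magnitude:

\[
200\ \text{glasses} \times 100\ \mathrm{IU/glass} = 20{,}000\ \mathrm{IU} \approx \text{one session of strong summer sun}
\]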

  As for the fear of skin cancer, an unexpected revelation comes from Dr. Stephanie Seneff, a senior scientist at MIT who has been conducting research on the relationship between nutrition and health for decades. “Both cholesterol and sulfur afford protection in the skin from radiation damage to the cell’s DNA, the kind of damage that can lead to skin cancer,” Dr. Seneff told me. “Cholesterol and sulfur become oxidized upon exposure to the high-frequency rays in sunlight, thus acting as antioxidants to ‘take the heat,’ so to speak. Oxidation of cholesterol is also the first step in the process by which cholesterol transforms itself into vitamin D3.”

  According to Dr. Seneff, our bodies contain the necessary mechanisms to extract or produce beneficial nutrients from the sun while also shielding us from harm. “Circumventing this natural process, either by using sunblock or staying out of the sun entirely, makes us lose all the health benefits and gives a variety of disease processes free rein,” says Dr. Seneff.

  Dr. John Cannell, of the Vitamin D Council, summed it up for me in a few words: “Everyone should get as much sunlight as they can, without burning.”

  In addition to its cancer-preventing value, sunlight plays a critical role in preventing or curing rickets, a terrible disease that mostly affects children. Rickets is caused by a deficiency of vitamin D, which in turn causes dietary calcium to be inadequately absorbed, resulting in skeletal deformities and neuromuscular symptoms such as hyperexcitability.

  Sunlight’s ultraviolet rays are also associated with mitigating depression. These days we use the term SAD to denote seasonal affective disorder, a severe depression linked to the reduction in sunlight we experience during the winter. It afflicts 15 percent of the population, and it begins and ends at around the same time every year. Says the Mayo Clinic on its website: “If you’re like most people with SAD, your symptoms start in the fall and continue into the winter months, sapping your energy and making you feel moody.” The primary treatment for SAD is light therapy, or phototherapy—deliberate daily or near-daily exposure to natural or artificial light. The most commonly used phototherapy product is a “light box,” but before you try one, talk the matter over with your physician. If you suffer from both SAD and bipolar disorder, for example, then a too-quick increase in light exposure could induce manic symptoms.

  Now that you know about the health benefits of both the visible and invisible components of sunlight, let’s explore its opposite: darkness. For our ancestors, nightfall ushered in a period of total darkness, illuminated only by moon, stars, and fire. Now many of us spend the night awash in the glow of computer screens, alarm clocks, and streetlights. But just as our bodies need adequate sunlight, some studies suggest they also need regular periods shielded from light, including the rays emitted by indoor lighting.

  Here we come to an astonishing medical revelation: the single surest cause of breast cancer is—a lack of darkness!

  According to the Breast Cancer Fund, “Effects of night-shift work on breast cancer risk are greatest for women who work rotating hours that include the overnight (as opposed to evening) shift and for those who work twelve-hour shifts that frequently switch between day and night work.… Risk of developing breast cancer increased for women who worked night shifts for more than four and a half to five years, especially those who regularly engaged in night work for at least four years prior to their first pregnancy, [i.e.,] before their mammary systems had fully differentiated.”

  These results are concerning, the fund explains, because around 15 percent of the US workforce currently works at least some of the time on non-day shifts. The report continues: “The most thoroughly studied mechanism to explain the effects of night-shift work is called the light-at-night (LAN) hypothesis. Increasing exposure to light, especially bright indoor lights, at times outside of normal daylight hours, decreases secretion of melatonin,” which can increase the risk of breast cancer. “In support of the light-at-night hypothesis, blind women who are completely unable to perceive the presence of environmental light, and therefore have no daily decreases in melatonin levels, have a statistically significant lower risk of diagnosis of breast cancer than do blind women who do perceive light and have regular decreases in melatonin secretion over the normal twenty-four-hour cycle. The former effect (no daily melatonin decreases) and its opposite in night-shift workers (no daily melatonin increases) both support the conclusion that the greater the secretion of melatonin, the lower the risk of breast cancer.”

  This all suggests that human health not only requires visible sunlight but also very much demands the invisible portion of sunlight—ultraviolet light. And, perhaps almost as important, the body must be in tune with sunlight’s natural cycle, which means that we must also regularly experience its absence.

  The takeaway here is that artificial indoor light contradicts nature’s alternating dark-light rhythm. Artificial light may be fine for extending our hours of daily productivity, but it shouldn’t be left permanently on during our sleep cycle. In other words, we need darkness. What remains unclear is the quantity: how much dim light is sufficient to create a medical problem? Is it the amount given off by a night-light? A clock’s LED digits? A streetlight shining behind closed curtains? Passing car headlights? How much is too much? At the time of this writing there’s no clear-cut answer.

  Ultraviolet light from the sun stimulates the human body to create vitamin D, one of the most potent anticancer agents known. Research suggests that people should not routinely use sunscreen to block ultraviolet rays but actually expose themselves to as much sunlight as they can without burning. (Matt Francis, Prescott Observatory)

  Besides its health benefits, UV light helps us see things that are invisible to us in regular light. The reason we can’t see UV light is that its waves are absorbed by the lens of the eye and don’t reach the retina. Good thing, too, because if they did reach the retina, they would damage the cone-shaped receptor cells responsible for color vision. But some animals, including reptiles, birds, and many insects, such as butterflies and bees, can see UV rays. This isn’t accidental. Many seeds, fruits, and flowers “pop” in UV light and are easier to find against a busy background. Famously, scorpions glow brilliantly under UV illumination. Some birds even display plumage designs that are invisible in ordinary light. Moreover, the semen and urine of many animals, including humans, will glow in UV light. Health inspectors use a source of UV light along with detection equipment to spot unclean and improperly washed hotel rooms. That’s because bodily fluids fluoresce: their complex chemicals absorb UV light and emit it as visible light.

  Suburbanites commonly use UV light fixtures to attract flying pests that “see” its waves, luring them to “zappers” or traps. These devices emit the kind of UV light that has the longest waves—UVA, just barely unseen by the human eye—because flies are most attracted to light at 3,650 angstroms—right in the middle of UVA’s range.
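  A quick unit check (my arithmetic, not the author’s): an angstrom is a tenth of a nanometer, so the fly-attracting wavelength corresponds to 365 nanometers, which indeed falls near the middle of the roughly 315-to-400-nanometer band usually assigned to UVA:

\[
3{,}650\ \text{\AA} \times 0.1\ \mathrm{nm}/\text{\AA} = 365\ \mathrm{nm}, \qquad 315\ \mathrm{nm} < 365\ \mathrm{nm} < 400\ \mathrm{nm}
\]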

  In addition to killing bugs, ultraviolet light helps us understand the cosmos. The ultraviolet universe has a very different appearance from familiar galaxies and stars, which mostly glow in visible and infrared light. That’s because most stars are cooler than our sun, and even our sun only emits 10 percent of its light as UV rays. Indeed, 95 percent of all stars shine overwhelmingly in the infrared and visible parts of the spectrum, and the majority emit more infrared and less visible light than our own beloved sun does.

  Ultraviolet radiation—detected by orbiting spacecraft above our UV-blocking atmosphere—is always the telltale fingerprint of unusually intense heat. Superhot stars, which appear blue to the eye, are either anomalously massive or are being viewed in the early or very late stages of their evolution, when their electromagnetic emissions are especially intense. Such extreme heat does more than crank up the emissions across the entire spectrum—it also shifts the emission curve toward the shorter, more energetic end, making blue light and especially ultraviolet light more plentiful.
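  The shift described here can be sketched with Wien’s displacement law (my illustration; the temperatures below are typical textbook values, not figures from the text): the wavelength at which a star radiates most strongly varies inversely with its surface temperature, so a star several times hotter than the sun peaks well into the ultraviolet.

\[
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2.9 \times 10^{-3}\ \mathrm{m \cdot K}
\]
\[
T_{\odot} \approx 5{,}800\ \mathrm{K} \;\Rightarrow\; \lambda_{\max} \approx 500\ \mathrm{nm}\ (\text{visible}); \qquad T \approx 20{,}000\ \mathrm{K} \;\Rightarrow\; \lambda_{\max} \approx 145\ \mathrm{nm}\ (\text{far ultraviolet})
\]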

  Instruments that detect ultraviolet light in the heavens do not show most stars at all. But they do show the violence of merging galaxies and the extreme temperatures of the giant blue stars that even in visible light often serve as the lighthouses of the cosmos. In short, UV telescopes offer a portrait not of average, middle-mass, ordinary stars like our own sun but rather of the exceptional places where violence is routine. Through UV “eyes,” it’s always the apocalypse.

  The awakening of science to this new way of seeing the cosmos began with Johann Ritter in 1801. By 1815, scientists found that Ritter’s “chemical rays” darkened not just silver chloride but also many other kinds of metallic salts. Between 1826 and 1837, Nicéphore Niépce, credited with taking the first successful photograph in 1827, and Louis Daguerre, the most famous photographic innovator of his day, found that silver iodide was especially light sensitive, and they used this discovery as the basis for their early work, which even then had begun to gain international notice. By 1842, others found that when sunlight hit a silver iodide coating, the light-sensitive surface of a daguerreotype plate, it induced a photochemical reaction. Practical photography was born.

  During the remainder of the nineteenth century, physicists kept making important theoretical and empirical discoveries that clarified the nature and properties of UV rays, although they were still called chemical rays until the 1870s. The development of very bright artificial lighting such as carbon arcs—dazzling spotlight-type fixtures that emit light by causing high-voltage electrical charges to leap across a short gap—provided the world with light sources that gave off copious UV radiation. The most important breakthrough came in 1859, when Gustav Kirchhoff and Robert Bunsen invented the spectroscope, which shows the composition of light by splitting it into its constituent wavelengths. After that, scientists could identify any light-emitting substance simply by noting the patterns of colors in its light. As they watched a burning building in the distance, for example, their spectroscope told them that the pipes contained lead. Suddenly the fields of physics, astronomy, and chemistry changed forever. The spectroscope enabled researchers to determine the composition of anything—even a star—merely by observing its emitted light.

  Years later, scientists learned that the sun’s light—visible and invisible—is merely the by-product of a process alchemists had vainly tried to reproduce for centuries—the transmutation of one element into another. That nature accomplishes this before our very eyes, and that it is what creates the solar heat and light that supports all life, was suspected by no one. The revelation came as a complete surprise.

  Let’s temporarily leave our timeline of invisible-light discoveries to explore this astounding central talent of the sun.

  CHAPTER 8

  The Exploding Sun

  In Woodstock, New York, aging ex-hippies of my acquaintance still say things like, “It’s all energy!” They’re right. While old-time physicists used to speak of frictional energy, chemical energy, mechanical energy, electrical energy, kinetic energy, potential energy, and lots of other varieties, science has come to see that, almost mystically, “it’s all one!”

  Meaning, first of all, that the universe was born with all the energy it will ever have. And what it has never diminishes. This seems counterintuitive, because we do use up the fuel in our cars—and watch some of it get wasted as heat that goes out through the tailpipe or as heat that the rubber tires leave on the road. Energy seems to diminish. But in reality, it merely changes form.

  Let’s look at the forms of energy that used to seem so distinct and various. Say you slam on the brakes in your car, an action that requires mechanical energy and that uses up chemical energy in the form of gasoline. Your car then slows down because friction at the brake pads and between the tires and the asphalt slows the spinning wheels. All sorts of different energies seem to be required simply because the idiot in front of you screeched to a halt the instant the traffic light turned yellow.

  But look more closely. The mechanical energy involved moving parts, as did the frictional energy. Friction converted the mechanical energy to heat—and what is heat? It’s merely the motion of atoms. That’s all there is to it—the simple movement of atoms and molecules. So really what happened was that a macroscopic (visible) display of energy—tires slowing—was changed to a submicroscopic form, as umpteen atoms were sped up. In total no energy was lost. It just changed form. And it involved nothing more than motion of various kinds. So it turns out that all energy is motion.
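  To put a number on that bookkeeping (a hypothetical example of mine, not the author’s): the kinetic energy of a moving car is half its mass times the square of its speed, and when the car stops, essentially all of it ends up as heat, that is, faster-jiggling atoms, in the brake pads, the tires, and the road. For a 1,500-kilogram car braking from 100 kilometers per hour (about 27.8 meters per second):

\[
E_k = \tfrac{1}{2} m v^2 = \tfrac{1}{2} \times 1{,}500\ \mathrm{kg} \times (27.8\ \mathrm{m/s})^2 \approx 5.8 \times 10^{5}\ \mathrm{J}
\]

  Roughly 580,000 joules of large-scale, visible motion becomes invisible atomic motion; none of it disappears.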

  Other examples? Well, if we start with thermal energy, or heat energy, all superhot objects routinely convert their heat into visible electromagnetic radiation (light). Or you can use a thermocouple to convert thermal energy into electrical energy. Or employ a steam turbine to convert thermal energy into mechanical energy.

  Or you could go the opposite way and start with mechanical energy, such as the kind found in an engine. You could convert it into a different form of mechanical energy via gears or levers. Or you could convert it into nuclear energy via a synchrotron or particle accelerator. Or into thermal energy by hitting your brakes. Or into electrical energy by using a generator. Or into chemical energy by striking a match.

 
