Deep Future

by Curt Stager


  Sooner or later, the new 14C atom loses control of its unstable core. The atom flicks a tiny beta particle out of its nucleus and reverts to plain old nitrogen. That’s what radioactivity is all about: the forceful ejection of debris from overweight atomic nuclei.

  In some cases, the flying chunks that radioactivity sprays hither and yon are energetic enough to injure living things. Radium, for instance, emits a continuous volley of potentially dangerous subatomic buckshot. We are well advised to stand clear of such powerful nuclear radiation sources because their invisible ejecta can plow through our cells and cause burns, radiation sickness, or cancer. But carbon-14 is merely a pop-gun by comparison, and under most circumstances its shrapnel is unlikely to damage your genes or tissues. The only way it can hurt you is if you inhale or swallow it so it can move in close to your cells by way of your bloodstream. Of course, we do exactly that every time we breathe, eat, or drink. In other words, it’s not the 14C around us that poses a threat; it’s the 14C within us.

  In the case of 14C, we face several possible health hazards. It lies deeply embedded in our carbon-rich body tissues, so when any given bit of it explodes, the recoil or the flying pellet may damage an adjacent molecule or cellular feature of importance. But there’s another threat as well. Carbon is a key component of our genes, so some of the ladderlike chains of DNA that they consist of contain radioactive time bombs. When one of them goes off, the precisely organized structure of the gene changes in ways that may alter the health or behavior of its host cell. And if one or more of the supporting structural proteins that bind to it fly apart, it could change how a gene folds or unfolds as it is being stored or opened for use. On a really bad-luck day, such mutations could cause a birth defect or a deadly tumor.

  The risk is real, though blessedly small because of our built-in cellular repair mechanisms, and until recently it was thought to be an unavoidable fact of life. But no longer. The boundlessly innovative energy of the marketplace has spawned a way to make money from this ancient bane of human health. Why settle for organic, pesticide-free produce when you can buy it radiocarbon-free as well? All that is needed is to raise vegetables in a tightly sealed chamber that allows the composition of the air inside to be controlled. The exact details of the method are a trade secret because, according to one patent application that I found online, “they’re so simple and obvious,” but they seem to have something to do with burning coal, oil, or gas and running the CO2-charged exhaust into the greenhouse chambers. Fossil fuels are devoid of 14C, so when crop plants inhale those vapors they grow up free of radioactivity, probably the first organisms ever to do so on this planet.

  Most 14C atoms form rather quietly in the upper atmosphere, but they don’t go unnoticed there for long. Oxygen, the second most common component of air, quickly locates solo carbon atoms and sticks to them. Within hours of their birth, freshly made 14C atoms and their oxygen hitchhikers drift away on the wind as carbon dioxide. Eventually, those radioactive CO2 molecules mix downward into the lower reaches of the atmosphere, where they can be sucked into the cells of photosynthetic bacteria, algae, and plants. There they blend in with the normal carbons until, sooner or later, they revert to their original nitrogenous selves in a burst of energy and flying particles.

  We, in turn, eat radioactive animals and plants and use the nutritious components of their tissues to build our own bodies. We thereby divert some of the world’s biological carbon flow into our bodies with each meal and release some with each exhalation, excretion, secretion, exfoliation, and birth. But when we die our bodies are cut off from the stream and, from that moment on, we stop replacing the 14C atoms that break down inside us. After 5,730 years, about half of our 14C load will have vanished. As another 5,730 years pass, half of those remaining atoms break down, and so on until all of them are gone or they’re too rare to measure accurately. Normally, that near-total decline takes about 50,000 years.

  Much to the delight of scientists, the average rate of 14C decay is so reliably stable that we can use it as a molecular clock to date once-living things. To do that, we need only measure how much of it remains intact within an object. If about half of the expected amount of 14C is present, then the substance is probably close to 5,730 years old. If only a quarter remains, then it’s twice as old, and so on. The less 14C you find in something, the older it’s supposed to be.
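  For readers who like to see the arithmetic spelled out, the decay relation behind this reasoning can be written as a simple formula. This is a generic sketch using the 5,730-year half-life quoted above; a real dating lab would add calibration steps on top of it.

$$ t = \frac{5730}{\ln 2}\,\ln\!\left(\frac{N_0}{N}\right) $$

  Here $N_0$ is the amount of 14C expected in a freshly formed sample and $N$ is the amount actually measured. Setting $N/N_0 = 1/2$ gives $t \approx 5{,}730$ years; setting $N/N_0 = 1/4$ gives $t \approx 11{,}460$ years, and so on.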

  Unfortunately, the opening centuries of the Anthropocene have now thrown that system into disarray. The fossil fuels that we burn are millions of years old, so the unstable 14C atoms that were originally present in them broke down long ago. The CO2 that forms when coal, oil, and natural gas are burned is a “dead” gas, unlike the stuff that goes up our chimneys when we burn recently cut, 14C-laden wood. By recycling tons of stable fossil carbon back into circulation, we’ve reduced the natural radioactivity of the atmosphere. At last, a bright side to air pollution: this kind slightly dilutes the radiocarbon load in the air and in our bodies.

  If future historians use radiocarbon dating on objects from these early stages of the Anthropocene, they’ll face a serious problem. All of the bone, hair, wood, aquatic mud, and other such materials that formed between the late 1800s and mid-1900s were less radioactive than usual because they were diluted by stable fossil 12C in the air. When the maxim of “less 14C means older” is applied to such samples, they yield ages that appear to be artificially old.

  According to Darden Hood, of the Beta Analytic dating lab in Miami, the Suess effect on radioactive carbon first became clearly noticeable in the 1890s. “If you analyze tree rings that were laid down between the late 1800s and the 1940s,” he explained to me recently, “you find about three percent less carbon-14 in them than you’d otherwise expect.”

  To put it another way, imagine radiocarbon dating the remains of a U.S. infantryman who died in World War I. Fossil fuel carbon was already contaminating the air, oceans, and human bodies of his day enough to change their apparent ages. The time offset in the soldier’s bones would therefore suggest that he died 200 to 300 years earlier, well before the nation that he fought for was born. Although nobody could have known it then (14C wasn’t discovered until 1940), everyone who lived during the first few decades of the twentieth century was a living fossil, thanks to global carbon pollution.

  But things became even more complicated after we invented yet another kind of carbon pollution during the closing stages of World War II. The United States, the Soviet Union, and other nations began to test nuclear and, later, thermonuclear devices high in the atmosphere. Those detonations were, in a sense, like artificial cosmic ray bursts that created radioactive 14C by smashing nitrogen atoms in the air. Hundreds of nukes exploded in aboveground tests during the 1950s and 1960s before a partial ban outlawed them, collectively producing so much 14C that it temporarily overwhelmed the Suess effect from our fossil fuel emissions. By 1963, at the height of that grim fireworks display, atmospheric 14C concentrations worldwide had nearly doubled.

  As a result of that global nuclear contamination, all organisms that have sprouted, crept, flown, or swum since the 1950s have been artificially enriched in radioactive bomb carbon. It travels with you wherever you go, in the skin of your hands, in the pages of this book, in the snuffly wet nose that your dog is pressing against you in hopes of winning a radioactive snack treat. Published estimates of the resultant damage to human health over the subsequent decades are difficult to confirm, but they range from hundreds of thousands to millions of cancers and birth defects.

  Not all of the effects of bomb carbon pollution have been bad, though. Forensic investigators have learned to use it in amazingly creative ways.

  Imagine that you have just invested a small fortune in a case of fine wine that is supposed to have been vinted in 1910. Why not check the radiocarbon content of the alcohol, just to be sure? If it contains excessive amounts of 14C, then the wine came from grapes that ripened during or after the bomb-happy 1950s. Too bad for you.

  Or perhaps you have bought an ivory sculpture that is supposed to have been made from legal tusks that were harvested before an international ivory ban was enacted. If you want to be sure that you’re not supporting the elephant poaching trade, you could check to see if the amount of 14C in that sculpture matches the atmospheric concentration of bomb carbon from the supposed harvest year.

  Even wildlife biologists have been turning the radiocarbon lemon into lemonade. At Isle Royale National Park, a wooded island in Lake Superior, the concentrations of bomb carbon in the teeth of moose have been used to determine when the animals were born.

  And what has all that bomb carbon done to radiocarbon dating? By boosting radioactivity levels everywhere, it has completely reset the planet’s radiocarbon clock. Early in the twentieth century, the diluting Suess effect made everything seem to be older than it really was, but nowadays every living thing on Earth has more 14C in it than it otherwise would. When you now apply the traditional formula to modern objects, you don’t get artificially ancient radiocarbon ages any more, but you don’t get accurate “zero” ages, either. Bomb carbon has pushed the dial on the theoretical isotopic clock so far forward that it runs us right through the present moment and into the future. You may be living in 2011 but most of your body—or, at least, its apparent age—lies many centuries ahead of you on the radiocarbon time line.
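  To make that “future age” idea concrete, here is a minimal sketch of the same decay arithmetic in Python. The 3 percent deficit matches the Suess-effect figure that Hood cited for early-twentieth-century tree rings; the 7 percent excess is a hypothetical bomb-carbon value chosen purely for illustration.

```python
import math

HALF_LIFE_YEARS = 5730  # conventional carbon-14 half-life

def apparent_radiocarbon_age(ratio):
    """Apparent age, in years, from the ratio of measured 14C to the
    level expected in a 'modern' sample. Ratios below 1 give positive
    ages (the past); ratios above 1, as with bomb carbon, give negative
    ages -- dates that lie in the future."""
    return (HALF_LIFE_YEARS / math.log(2)) * math.log(1 / ratio)

# A Suess-effect sample with 3 percent less 14C than expected:
print(round(apparent_radiocarbon_age(0.97)))   # about +250 years too old

# A hypothetical bomb-carbon sample with 7 percent more 14C than expected:
print(round(apparent_radiocarbon_age(1.07)))   # about -560 years, i.e., "in the future"
```

  The first figure lands in the 200-to-300-year range quoted for the World War I soldier; the second shows how even a modest surplus of bomb carbon is enough to push an apparent age several centuries past the present.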

  According to Darden Hood’s calculations, I was seemingly born in 5300 AD. In reality, I entered this world in 1956, but the food molecules that Mom inadvertently passed along to me through my fetal umbilical cord were infested with 14C atoms that formed in mushroom clouds, probably somewhere over the Pacific. The bomb carbon that flooded the food webs of 1956 made me so radioactive that my newborn body lay three virtual millennia ahead of me in the future.

  But bomb pollution is fading fast now. It’s not that it breaks down very quickly; that process will take thousands of years to put a noticeable dent in the reservoir. Rather, carbon-bearing minerals and the bodies of aquatic organisms are falling into layered tombs on the sea floor year after year, a process that locks their radioactive carbon away along with a large fraction of our fossil carbon emissions. The decrease is exponential, and most scientists expect the Suess effect to regain dominance over bomb carbon within another decade or two.

  Because the bomb carbon effect is growing weaker and weaker, it is also driving our apparent radiocarbon ages less and less deeply into the future. Not only do today’s newborns contain far less 14C than I did in 1956; the food we eat brings us ever lower doses of the stuff, too. This should also make our present isotopic age offsets less extreme than the ones we were born with.

  I asked Hood to crunch the numbers for me. After a brief pause punctuated by the clicks of a calculator, he answered. “Most people living today still have a fair bit of bomb carbon in their bodies. If you eat the same kind of breakfast cereal that I do each morning, then I would expect your radiocarbon age now to be something like 580 years. In the future.”

  Somewhat confusing, perhaps, to be living backward through time like Merlin the Arthurian wizard. It’s as though I started my life in 5300 AD and ended up in 2589 AD by late middle age. But even this bizarre aspect of the bomb carbon saga has a decidedly uplifting side to it. Biomedical researchers are using the known annual decline of bomb carbon concentrations as a tool to help them answer important and long-standing questions about human bodies and health.

  Do our brain cells form only when we’re young? If so, then the effects of “sex, drugs, and rock and roll” on our neuron supplies might be more persistent than we would like. Are fat cells permanent fixtures in our hips and bellies? If so, then diets are only temporary holding actions against obesity. And what about that tumor we’ve just spotted; did it balloon up recently or is it simply a slow-growing lump?

  Bomb carbon analysis can help to answer all of these questions. Different parts of our bodies contain different amounts of 14C, not because they reject or absorb it differently but because they formed at different times in our lives. Matching 14C contents to specific years has been used to bring us both good news and bad; the visual and memory neurons that we lose to fun and games are not renewed, sad to say, but our body-fat cells are replaced every eight years or so, and the central cores of many tumors are potentially datable enough to help with the design of cancer treatments.

  The time-warping effects of bomb carbon will only influence objects that formed between the 1950s and, say, 2020 AD. But those effects are strange indeed, and future scientists will have to deal with them if they try to probe the artifacts of our times with radiocarbon dating. Of course, this is among the least of our worries as we consider the consequences of Anthropocene carbon pollution; the disruption of radiocarbon ages, per se, isn’t going to melt any ice or drive any species to extinction, and the dilution of 14C with inert fossil carbon is, if anything, a health benefit.

  But what if we look at it another way? Most of us hope to leave some sort of positive mark on the world after we’re gone, some sign that we’ve been here. It’s not much to ask for, even if the urge may be rooted in ego gratification or a deep-seated fear of mortality. However, by ruining the radiocarbon age labels that will accompany the remains of our bodies and our civilizations down through the ages, we have utterly scrambled the chronological filing system of history.

  Future scientists who hunt for the artifacts that we’ve produced during our lives won’t be able to radiocarbon-date them easily because items from the first half of the twentieth century will seem to have come from earlier times, and those from the second half of the twentieth century and the early decades of this one will seem to have come from later times. In fact, there won’t be any objects radiocarbon-datable to our times at all, at least not correctly so. When the Suess effect takes over again after the fading out of bomb carbon, it will once again make objects seem artificially old throughout the remainder of the Anthropocene carbon tail-off. At some point, certain objects may appear to have come from our own century because their radioactivity levels seem to indicate it, but they’ll merely be time-traveling imposters in museum collections of the future.

  Our current time frame simply won’t exist in literal readings of the geologic record. From a future historian’s perspective, the tale of our present world, in a radiometric sense, will be a missing chapter that was torn completely away from the book of history.

  6

  Oceans of Acid

  When beholding the tranquil beauty and brilliancy of the ocean’s skin, one forgets the tiger heart that pants beneath it; and would not willingly remember that this velvet paw but conceals a remorseless fang.

  —Herman Melville, Moby-Dick

  Like most people, I have tended to dwell on weather, ice sheets, and sea levels when I’ve thought about global warming. But there’s yet another aspect of the issue that has gotten less attention in the public realm, one that involves chemical changes that are far more damaging to living things than carbon isotope imbalances. Ocean acidification is one of the most ecologically important aspects of today’s carbon crisis, but it is rarely discussed while we fret about things that are more plainly open to view. The climatic shifts caused by our artificial greenhouse gas buildups may arguably provide benefits in some situations as well as harm in others, and in the end they are reversible over time. But there is no obvious bright side to the irreversible, acid-driven extinction of species. As a direct threat to the aquatic life that swims, crawls, or sits beneath our planet’s mostly blue surface, ocean acidification decisively tips the scales of ethical judgment against allowing our fossil fuel emissions to continue unabated.

  How can oceans acidify? They seem too huge to be harmed by something as seemingly benign as air. They are also shielded from most major chemical disturbances by a defensive line of substances known to chemists as the “carbonate-bicarbonate buffering system.” But although the buffering system offers some protection, it does so only up to a point. Under a sustained pollutant assault, the chemical fortress of the sea can indeed be overwhelmed.

  Here’s the problem: CO2 dissolves in water. That’s no great surprise if we remember that fish breathe waterborne oxygen through their gills. In fact, about a quarter of the excess CO2 that we emit each year diffuses into the oceans. From a purely climate-centered perspective, the marine uptake of our excess CO2 resembles a helpful alliance, as if the oceans were on our side in the struggle against global warming. After all, this is how most of our carbon emissions will eventually be scrubbed from the atmosphere. But CO2 doesn’t just disappear when it enters a water body like the ocean. It morphs into carbonic acid.

  The chemistry of this process is rather involved, but the most important concepts to grasp here are fairly simple. All of the main molecular characters that you need to know about include carbon in their physical structures as well as in their names, but some are predators while others are prey. The top predator is carbonic acid, and its primary prey are base molecules called “carbonates,” particularly the mineralized forms of calcium carbonate. Spill battery acid onto a patch of concrete pavement—cement is often refined from marine carbonate deposits—and the acid molecules will savage their target in a sizzling burst of foam that leaves behind a sullen puddle of neutralized waste salts. That kind of reaction saves lives when it drives the frothing spray of a fire extinguisher, but it snuffs them out when it attacks the shells of living sea creatures.

  Carbonic acid releases tiny, positively charged hydrogen ions into the surrounding water. Their positive charge reflects the temporary loss of a negative electron from each hydrogen atom, a natural result of dissolving in water. If this concept is new to you, then it might help to think of molecular bathers whose swimsuits have been removed: imagine the naked particles drifting about in a state of nervous eagerness to replace their protective electron coverings. Any molecular passerby wearing potentially removable electrons, as do carbonates and their bicarbonate cousins, is at risk of a destructive chemical stripping.
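  For the chemically inclined, the chain of events described in the last few paragraphs can be summarized in standard notation. This is a simplified sketch of the marine carbonate reactions, not the full set of seawater equilibria:

$$ \mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3} $$

$$ \mathrm{H_2CO_3 \rightleftharpoons H^+ + HCO_3^-} $$

$$ \mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-} $$

  Dissolved CO2 becomes carbonic acid, the acid sheds hydrogen ions, and those ions convert carbonate into bicarbonate, removing the very building material that shell-forming creatures use for their calcium carbonate armor.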

 
