Immortality, Inc.


by Chip Walter


  Hixon is Alcor’s constant gardener. He trolls the arboretum’s frozen canisters, regularly topping them off with liquid nitrogen to ensure the rigid bodies and brains of those within remain neither dead nor alive, but securely somewhere in between. “Time is our enemy,” Hixon says. “I can’t stop it, but I sure as hell can slow it down.”

  The Patient Care Bay is big, a good-size warehouse with high, corrugated walls, exposed beams, and smooth cement floors. Alcor’s 18 canisters are 11-foot-tall stainless steel thermoses: slim, silvery, and upright. A doorway sits above the chamber like a large sunroof. Usually it’s closed, but with the click of a button, Hixon can open it to the heavenly blue Arizona sky. This allows a crane, with its 2,000 pounds of chain, to reach below, carefully haul up Alcor’s patients, and deposit them into their appointed canister: a necessary piece of throwback machinery in the service of an unknown and mysterious future. One canister can house six full bodies, or 10 neuros, or some combination of both. It’s tight quarters, but in the vitrified world, people don’t seem to much complain about the crowding.

  In 2016, of the 149 patients housed at Alcor, 37 were women; the other 112 were men. Another 1,116 had already signed on to join the current tenants at some future date, including people like Ray Kurzweil, PayPal co-founder Peter Thiel, biogerontologist Aubrey de Grey, nanotechnology pioneer Eric Drexler, and, of course, Ralph Merkle. The average age of the currently vitrified is 65.

  Hixon is also Alcor’s chief depositor. When a new patient arrives, he is the man who sees them to their next, but hopefully not final, resting place. The body is swaddled in a chilled sleeping bag and slipped into a metal pod of Hixon’s design. (Neuros are packed in Dacron wool, then placed in a “neuro can,” a kind of metal helmet filled with liquid nitrogen.) Everything is painstakingly labeled and logged. (One wouldn’t want to be misidentified when one awakens to a new future.)

  After days of slow cooling, the patient is finally ready to head home. Hixon pushes the appointed thermos into position on the upright wheels that carry it. Then he shuffles back to the end of the long warehouse, where the other silvery canisters await at silent attention. There, he hits the white button, and a ritual unfolds. The gears crunch and clank. The chamber’s roof opens again to the Arizona sky. Hixon cranks the chains down to the empodded cryonauts and attaches both ends before the machine heaves them horizontally, like Frankenstein’s monster in the 1931 Boris Karloff movie. Then, one end of the pod is disconnected, leaving the patient dangling upside down before the rigid body or brain at last slips into its cryonic resting place. Once again Hixon hits the big white button, and the clanking eye of the roof disappears. There is a distant and solitary thud, and the job is done, for now. One more soul in limbo.

  Events like this can make a man think. Some days Hixon wonders if the whole business isn’t just a waste of time, all these people hovering between the eternal present and their someday futures. He figures the first cryonaut will be resuscitated within 50 years. But then, he’s been saying that for 30 years, so who knows. Sometimes he feels like he’s living in a science fiction novel, except he can’t page forward to find out what the ending will be. Maybe people really will wait years in a metabolic coma, frozen in time, until science delivers the breakthroughs that will repair and revive the ragged versions of their former selves. Who could say? That is the thing about Alcor: Every story is unresolved—every life…suspended.

  Hixon knows the routine better than anyone. Just as he knows that someday, someone else will be pushing the button for him, and on that day the roof will again open to the heavenly blue Arizona sky, and the crane will hoist him to his glistening dewar, where he too will slip into his appointed slot for a very long nap.

  2 | BOOMERS, BREAKTHROUGHS, AND FOUNTAINS OF YOUTH

  For as long as Homo sapiens have existed, we have been trying to snip the gnarly snares of mortality: myth, religion, cults—heaven, Elysium, Valhalla, Nirvana, Heaven’s Gate.2 Alcor is just the latest example. Everyone knows Alcor’s Final Protocols are a last-ditch effort to avoid death: a hedged bet, plan B. Even Max More admitted he didn’t long to take part in Alcor’s Final Protocols. Better to enjoy a life where aging and death didn’t exist: plan A.

  No one yearned for plan A more than baby boomers. Just as they began hitting their 60s, the idea of super-longevity began rising up like some collectively unconscious Greek chorus. That was just around the time the Pew Research Center poll came out, and all those magazine covers began showing up on newsstands. National Geographic’s cover read, “This Baby Will Live to Be 120.” The Atlantic asked, “What Happens When We All Live to Be 100?” Time proclaimed: “This Baby Could Live to Be 142 Years Old.”

  Truthfully, the Time article didn’t reveal anything terribly new about how one might live 142 years. It rehashed information about telomeres, dispensed advice on the best places to live in your later years, nodded to the ways diets low in red meat and sugar and high in good fat seemed to slow heart disease, and revealed how rapamycin, a drug developed to reduce organ rejection in transplant patients, helped a lab mouse named UT2598 live eight months longer than normal. That sort of thing. All good and useful information, but certainly nothing that warranted the proclamations on the magazine’s cover.

  Why all the hoopla? Because the articles, books, and studies were tapping into the deepest fears of baby boomers. Boomers were a peculiar generation. They had emerged as the result of a massive case of pent-up, postwar lovemaking. For decades, child rearing had taken a weary backseat to the scarcity and menace of the Great Depression and World War II. But then, years of coitus interruptus gave way to a great blossoming of coitus semper.

  In 1945, for the first time in nearly 20 years, the future looked like one lovely bed of roses, at least in America. The U.S. economy boomed, jobs soared, money flowed, and newborns arrived in great cherubic waves. By 1957, an American baby was being born every seven minutes, and by 1964, the statisticians had counted 76.4 million new children in the United States since the end of the war. Boomers soon made up almost 40 percent of the nation’s population—and not one of them had yet reached age 20! In 1966, Time magazine made boomers its “Persons of the Year.” Fifty years later, they were still plowing their immense demographic girth through the world’s markets and culture like a pig through a python.

  But now, boomers were growing—how could one put it delicately?—old. Between 2020 and 2035, the population of Americans age 55 to 64 was projected to grow a whopping 73 percent. The ruddy, glowing complexions and slim bodies of their Woodstock days had deserted them. And being a group that associated itself with making (or breaking) the rules and discarding the status quo, they did not much care for that. The very idea that they were actually mortal collided with their self-image as game changers: a generation whose youth and energy and power had always allowed them to accomplish just about anything.

  What boomers didn’t invent, they popularized to the point of transforming the very idea of youth into an immense and ever-growing industry. In 2012, when Arianna Huffington, president of the Huffington Post, hosted a roundtable on aging with celebrated authors like Gail Sheehy, she called boomers the wealthiest, most active generation ever. All their lives, they had genuinely expected the world to improve as time passed. And obstacles like aging and dying just didn’t fit into that picture.

  A fierce fusion began bubbling up, a boundless, generational desire for an all-out assault on the most hated enemy of humankind: aging. There was the vague but palpable hope that death and decrepitude didn’t have to be inevitable, that living not simply longer (like their parents), but better, stronger, wiser, and happier could somehow be in the cards.

  And every day, more attempts surfaced. You only had to look at the World Congress on Anti-Aging Medicine for proof. In the early 1990s, the convention amounted to nothing more than a trickle in the domain of medical purveyors. These days visitors arrive at the Congress by the thousands, and the marketeers make it clear that nothing is beyond the reach of modern medicine as it marches forward to advance “scientific and medical technologies for the early detection, prevention, treatment, and reversal of age-related dysfunction, disorders, and diseases.” In 2014, the convention was such a big deal that J. Craig Venter, the master of genomics himself and winner of the 2008 National Medal of Science, delivered the keynote.

  By the end of 2015, the once tiny trickle of the global antiaging market had risen to $292 billion. Americans were turning 50 every seven seconds—12,500 people a day—and they wanted rejuvenation! Three out of every five consumers were taking supplements on a regular basis, with global sales topping $132 billion and growing at an 8 percent clip every year. Botox, the number one cosmetic procedure, was performed 2.8 million times in 2014, up 157 percent since 2002. And more Botox was in the pipeline. The same year, 54 million exercisers were zipping around the strip malls of America to sweat over row after row of ellipticals and bodybuilding machines.

  But boomers weren’t alone. Forty-five percent of all cosmetic procedures in 2014 were performed on people between ages 35 and 50—gen Xers and gen Yers. Could gen Zers be far behind?

  But the real headline was that people over 50 now controlled 70 percent of America’s financial assets, and 50 percent of its discretionary income. Even the financial analysts over at Merrill Lynch couldn’t quite believe that the U.S. longevity investment sector would top seven trillion dollars in 2017, making it the world’s third largest economy. It was like a great and ever inflating balloon.

  Except that no one had yet found a way to truly stop time’s clock. Alcor, after all, was clearly not delivering a solution for life everlasting. It wasn’t as if millions were lining up for inclusion in the Chill Chamber over on East Acoma Drive. People wanted something more: They longed for the Big Breakthroughs.

  But to be blunt, no such breakthroughs existed. Boomers and their descendants may have wanted them, and the media certainly wanted to see them delivered, but desire—no matter how ardent—hadn’t yet provided anything that said, “Ah-ah! There is the path! The cure!” It wasn’t even clear such a thing was biologically possible. As recently as 2015, articles in magazines like Science were still quoting researchers like Derrick Rossi at Harvard saying, “We age so completely and in so many ways. We are programmed to die.” Well, who wanted to hear that?

  3 | THE DRIVE TO SURVIVE

  None of us can comprehend how the human race might manage living 300 or 400 years, or any other outrageously long time, without first understanding the social and scientific forces that have made the idea of it possible in the first place. That begins with explaining why we die.

  In 1899, tuberculosis killed more people in the United States than anything else. It was a hideous way to die, and the disease spread easily: a bacterial infection that essentially shredded the lungs and ravaged the body. The white plague, they called it. After tuberculosis came the next biggest killers: pneumonia, diarrhea, and gastroenteritis.

  This was one reason the average white American lived only 48 years, the average black American 34 at most—just 15 years longer than our ancestors had survived during their days wandering the plains of Africa. Three hundred millennia of evolution, 10,000 years of civilization, and all the human race had to show for it was a meager 15 years of additional longevity!

  By the time Henry Ford rolled out his first Model Ts and the fox-trot was all the rage, the average U.S. citizen was lucky to make it past his fifth decade. One out of four children died of typhus, pneumonia, or scarlet or rheumatic fever, vanishing at the rate of 10 to 35 percent a year. The simplest accident could snatch a person’s life. A worker might gash his hand at the factory, and die not long afterward of blood poisoning. In 1900, even the most advanced members of the medical arts would never in their wildest imaginations have considered that the average human could live 80 years.

  And why would they? When archaeologists pored over the writings of healers from Mesopotamia, Egypt, India, and Israel, they found plenty about migraines, seizures, smallpox, cholera, dropsy, and leprosy, but precious little about cancer, diabetes, heart disease, stroke, or dementia. Why? Because aging never had time enough to get a toehold. There were far too many other ways to die.

  But then, as the 20th century marched on, the statisticians who charted the nation’s actuarial tables began to notice people were living longer. Significantly longer. At first, this was mostly thanks to simple advances in sanitation. Water was cleaner. And milk, a major source of infectious bacteria, was pasteurized.

  In 1890, the first American sewage treatment plant using chemical precipitation was built in Worcester, Massachusetts. Large sanitation projects in big cities throughout Europe and America followed in the early 20th century, and chlorination was adopted in many cities after it was used to stem a typhoid fever epidemic in England in 1905.

  Medical care improved too. Surgeons as recently as the early 20th century didn’t think twice about eating a sandwich while performing an amputation in the operating theaters of the day; by World War I, they had learned that there was a connection between medical sanitation and the appalling number of deaths they had personally caused. In fact, throughout society, the role that germs played in disease became better understood. The modern world grew cleaner, if not perfectly safe, from hospitals to restaurants to the workplace. Even the white plague began to vanish. By 1940, cases of tuberculosis in the United States had fallen by half.

  The next big life extender was antibiotics. Even after improvements in sanitation, the really ugly killers were still infectious diseases. Often, the only barrier between life and a horrible death was the strength of whatever a person’s DNA and immune system had the good fortune to bestow. Then, in 1928, a Scottish bacteriologist named Alexander Fleming noticed something odd as he gazed through the microscope in his lab at St. Mary’s Hospital in London: The bacteria he was studying had stopped growing in their petri dishes. The reason: A few spores of a green mold called Penicillium notatum had accidentally gotten into the same dish.

  Scientists already knew that certain molds and bacteria didn’t get along. They had been waging predatory war with one another at the cellular level far longer than the human race had been around—probably billions of years. But thanks to this new bit of information, Fleming suspected the green mold could be used to kill bacteria outright—maybe whole battalions of bacteria. “When I woke up just after dawn on September 28, 1928, I certainly didn’t plan to revolutionize all medicine by discovering the world’s first antibiotic,” Fleming later said. “But I guess that was exactly what I did.” He called the substance “mold juice,” but later named it penicillin.

  Now all Fleming needed to do was create a vaccine or drug of some kind. But Fleming was no chemist, and his repeated efforts foundered. It took 12 more years—and the insights of an Australian pharmacologist named Howard Florey and a German-British biochemist named Ernst Chain—to manage that. In 1941, they purified enough penicillin to treat their first patient. It took three more years before the drug could be produced in bulk and applied the way doctors use antibiotics today.

  Scientific techniques now snowballed, and waves of new vaccines and antibacterial drugs followed: chloramphenicol in 1947, tetracycline in 1948, the first safe vaccine for polio in 1952. Between 1940 and 1950, the number of medicines that doctors commonly used more than doubled, and nefarious diseases that had been killing human beings since time immemorial fell like dominoes. Life expectancy leaped forward. Between 1900 and World War II, the average American’s life span increased 26 years, nearly twice the gain of the previous 300,000 years.

  Nevertheless, people still died. But now they were dying later, and from different diseases. In 1899, cancer was not even listed among the top five killers in the United States.3 It was so rare that when a respected surgeon named Roswell Park argued that cancer would someday become the nation’s leading cause of death, the medical community thought he had lost touch with reality. And yet by 1950, cancer took its place as the nation’s second leading killer. In the space of a single generation, the number of people surviving beyond 60 had nearly doubled. Good news—except now formerly rare diseases like heart attacks, cancer, and stroke were increasing. Longer life had created a new class of killers.

  This situation gave rise to something entirely new: gerontology. Élie Metchnikoff, a Russian Nobel laureate and pioneer in immunology, had coined the term (literally the “study of old men”) in 1903—but in those days, there really wasn’t much need for the field, because so few people actually grew old. Now, all of that was changing. Organizations like the Gerontological Society of America were formed, and pioneers like James Birren began to study how the body and brain aged.

  The field quickly branched into examining anything at all related to advancing age: pharmacology, public health, and the psychological effects, economics, and sociology of aging. Yet none of its practitioners, not even those in the biological branch, concerned themselves with what actually caused aging, or what could be done to prevent it. Even the sister field of geriatrics focused only on treating and reacting to the inevitable deterioration of aging: problems like loss of memory, mobility, and strength, and diseases like osteoporosis, arthritis, heart disease, and diabetes—whatever began to break down the body. But no one seemed much concerned with what could actually stop aging itself.

  To gerontologists, the reasons for this were obvious, because everyone knew that aging was simply something the body did. No respectable doctor gave any real thought to how one might arrest it. After all, everything, everywhere broke down, given enough time. Bridges, roads, machines, dogs, cats. Even mountains and valleys. It was entropy at work. The great circle of life. Why should we humans be any different?

 
