The Gene

by Siddhartha Mukherjee


  On the way back from Moni’s institution in Calcutta, my father wanted to stop again outside the house where he had grown up—the place where they brought Rajesh back in the throes of his mania, thrashing like a wild bird. We drove in silence. His memories had formed the walls of a room around him. We left the car at the narrow inlet of Hayat Khan Lane, and walked into the cul-de-sac on foot. It was about six in the evening. The houses were lit by an oblique, smoky light, and the air threatened rain.

  “Bengalis have only one event in their history: Partition,” my father said. He was looking up at the balconies jutting above us and trying to recall the names of his former neighbors: Ghosh, Talukdar, Mukherjee, Chatterjee, Sen. A gentle drizzle began to descend on us—or perhaps it was just the drippings from the laundry, hanging thickly on clotheslines tacked across the houses. “Partition was the defining event for every man and woman in this city,” he said. “Either you lost your own home, or your home became a shelter for someone else.” He pointed to the colonnades of windows above our heads. “Every family here had another family living within it.” There were households within households, rooms inside rooms, microcosms lodged inside microcosms.

  “When we arrived here from Barisal, with our four steel trunks and our few salvaged possessions, we thought we were beginning a new life. We had experienced a catastrophe, but it was also a fresh start.” Every house on that street, I knew, had its own story of steel trunks and salvaged possessions. It was as if all the inhabitants had been equalized, like a garden cut back to the roots in the winter.

  For a cohort of men, including my father, the journey from East to West Bengal involved a radical resetting of all clocks. Thus began Year Zero. Time was splintered into two halves: the era before the cataclysm, and the era after. BP and AP. This vivisection of history—the partition of Partition—produced a strangely dissonant experience: the men and women of my father’s generation perceived themselves as unwitting participants in a natural experiment. Once the clocks had been reset to zero, it was as if you could watch the lives, fates, and choices of human beings played out from some starting gate, or from the beginning of time. My father had experienced this experiment all too acutely. One brother had devolved into mania and depression. Another’s sense of reality had been shattered. My grandmother had acquired her lifelong suspicion of all forms of change. My father had acquired his taste for adventure. It seemed as if distinct futures—like homunculi—had been folded into each person, waiting to be unfurled.

  What force, or mechanism, might explain such widely divergent fates and choices of individual human beings? In the eighteenth century, an individual’s destiny was commonly described as a series of events ordained by God. Hindus had long believed that a person’s fate was derived, with near-arithmetic precision, by some calculus of the good and evil acts that he had performed in a previous life. (God, in this scheme, was a glorified moral tax-accountant, tallying and divvying out portions of good and bad fate based on past investments and losses.) The Christian God, capable of inexplicable compassion and equally inexplicable wrath, was a more mercurial bookkeeper—but He too was the ultimate, if more inscrutable, arbiter of destiny.

  Nineteenth- and twentieth-century medicine offered more secular conceptions of fate and choice. Illness—perhaps the most concrete and universal of all acts of fate—could now be described in mechanistic terms, not as the arbitrary visitation of divine vengeance, but as the consequence of risks, exposures, predispositions, conditions, and behaviors. Choice was understood as an expression of an individual’s psychology, experiences, memories, traumas, and personal history. By the mid-twentieth century, identity, affinity, temperament, and preference (straightness versus gayness or impulsivity versus caution) were increasingly described as phenomena caused by the intersections of psychological impulses, personal histories, and random chance. An epidemiology of destiny and choice was born.

  In the early decades of the twenty-first century, we are learning to speak yet another language of cause and effect, and constructing a new epidemiology of self: we are beginning to describe illness, identity, affinity, temperament, preferences—and, ultimately, fate and choice—in terms of genes and genomes. This is not to make the absurd claim that genes are the only lenses through which fundamental aspects of our nature and destiny can be viewed. But it is to propose and to give serious consideration to one of the most provocative ideas about our history and future: that the influence of genes on our lives and beings is richer, deeper, and more unnerving than we had imagined. This idea becomes even more provocative and destabilizing as we learn to interpret, alter, and manipulate the genome intentionally, thereby acquiring the ability to alter future fates and choices. “[Nature] may, after all, be entirely approachable,” Thomas Morgan wrote in 1919. “Her much-advertised inscrutability has once more been found to be an illusion.” We are now trying to extend Morgan’s conclusions—not just to nature, but to human nature.

  I have often thought about the possible trajectories of Jagu’s and Rajesh’s lives if they had been born in the future, say fifty or a hundred years from now. Would our knowledge of their heritable vulnerabilities be used to find cures for the illnesses that had devastated their lives? Would that knowledge be used to “normalize” them—and if so, what moral, social, and biological hazards would that entail? Would such forms of knowledge enable new kinds of empathy and understanding? Or would they nucleate novel forms of discrimination? Would the knowledge be used to redefine what is “natural”?

  But what is “natural”? I wonder. On one hand: variation, mutation, change, inconstancy, divisibility, flux. And on the other: constancy, permanence, indivisibility, fidelity. Bhed. Abhed. It should hardly surprise us that DNA, the molecule of contradictions, encodes an organism of contradictions. We seek constancy in heredity—and find its opposite: variation. Mutants are necessary to maintain the essence of our selves. Our genome has negotiated a fragile balance between counterpoised forces, pairing strand with opposing strand, mixing past and future, pitting memory against desire. It is the most human of all things that we possess. Its stewardship may be the ultimate test of knowledge and discernment for our species.

  * * *

  I. To understand how genes become actualized into organisms, it is necessary to understand not just genes, but also RNA, proteins, and epigenetic marks. Future studies will need to reveal how the genome, all the variants of proteins (the proteome), and all the epigenetic marks (the epigenome) are coordinated to build and maintain humans.

  II. Comprehensive testing of fetal genomes has already entered clinical practice, under the name of NIPT, or Non-Invasive Prenatal Testing. In 2014, a Chinese company reported that it had tested 150,000 fetuses for chromosomal disorders, and was extending the test to capture single-gene mutations. Although these tests seem to detect chromosomal abnormalities, such as Down syndrome, with just as much fidelity as amniocentesis, a major issue of the test is “false positives”—i.e., the fetal DNA is thought to carry a chromosomal abnormality, but it is actually normal. These false positive rates will decrease dramatically as technologies advance.

  III. Even seemingly simple scenarios of genetic screening force us to enter arenas of unnerving moral hazard. Take Friedman’s example of using a blood test to screen soldiers for genes that predispose to PTSD. At first glance, such a strategy would seem to mitigate the trauma of war: soldiers incapable of “fear extinction” might be screened and treated with intensive psychiatric therapies or medical therapies to return them to normalcy. But what if, extending the logic, we screen soldiers for PTSD risk before deployment? Would that really be desirable? Do we truly want to select soldiers incapable of registering trauma, or genetically “augmented” with the capacity to extinguish the psychic anguish of violence? Such a form of screening would seem to me to be precisely undesirable: a mind incapable of “fear extinction” is exactly the dangerous sort of mind to be avoided in war.

  This homunculus, wrapped inside human sperm, was drawn by Nicolaas Hartsoeker in 1694. Like many other biologists in his time, Hartsoeker believed in “spermism,” the theory that the information to create a fetus was transmitted by the miniature human form lodged inside sperm.

  In medieval Europe, “trees of lineage” were often created to mark the ancestors and descendants of noble families. These trees were used to stake claims on peerage and property, or to seek marital arrangements between families (in part, to decrease the chances of consanguineous marriages between cousins). The word gene—at the top left corner—was used in the sense of genealogy or descent. The modern connotation of gene, as a unit of hereditary information, appeared centuries later in 1909.

  Charles Darwin (here in his seventies) and his “tree of life” sketch, showing organisms radiating out from a common ancestral organism (note the doubt-ridden phrase “I think,” scribbled above the diagram). Darwin’s theory of evolution by variation and natural selection demanded a theory of heredity via genes. Close readers of Darwin’s theory realized that evolution could work only if there were indivisible, but mutable, particles of heredity that transmit information between parents and offspring. Yet Darwin, having never read Gregor Mendel’s paper, never found an adequate formulation of such a theory during his lifetime.

  Gregor Mendel holds a flower, possibly from a pea plant, in his monastery garden in Brno (now in the Czech Republic). Mendel’s seminal experiments in the 1850s and ’60s identified indivisible particles of information as carriers of hereditary information. Mendel’s paper (1865) was largely ignored for four decades, and then transformed the science of biology.

  William Bateson’s “rediscovery” of Mendel’s work in 1900 converted him into a believer in genes. Bateson coined the term genetics in 1905 to describe the study of heredity. Wilhelm Johannsen (left) coined the term gene to describe a unit of heredity. Johannsen visited Bateson at his house in Cambridge, England; the two became close collaborators and vigorous defenders of the gene theory.

  Francis Galton—mathematician, biologist, and statistician—put himself on one of his own “anthropometry cards,” in which he tabulated a person’s height, weight, facial features, and other characteristics. Galton resisted Mendel’s theory of genes. He also believed that the selective breeding of humans with the “best” features would lead to the creation of an improved human race. Eugenics, a term coined by Galton for the science of human emancipation through the manipulation of heredity, would soon morph into a macabre form of social and political control.

  The Nazi doctrine of “racial hygiene” prompted a vast state-sponsored effort to cleanse the human race through sterilization, confinement, and murder. Twin studies were used to prove the power of hereditary influences, and men, women, and children were exterminated based on an assumption that they carried defective genes. The Nazis extended their eugenic efforts to exterminate Jews, Gypsies, dissidents, and homosexuals. Here, Nazi scientists measure the height of twins, and demonstrate family history charts to Nazi recruits.

  Better Babies contests were introduced in the United States in the 1920s. Doctors and nurses examined children (all white) for the best genetic features. Better Babies contests generated passive support for eugenics in America by showcasing the healthiest babies as products of genetic selection.

  A “eugenics tree” cartoon from the United States argues for the “self-direction of human evolution.” Medicine, surgery, anthropology, and genealogy are the “roots” of the tree. Eugenic science hoped to use these foundational principles to select fitter, healthier, and more accomplished humans.

  In the 1920s, Carrie Buck and her mother, Emma Buck, were sent to the Virginia State Colony for Epileptics and Feebleminded, where women classified as “imbeciles” were routinely sterilized. The photograph, obtained on the pretext of capturing a casual moment between mother and daughter, was staged to provide evidence of the resemblance between Carrie and Emma, and thus proof of their “hereditary imbecility.”

  At Columbia University, and subsequently at Caltech in the 1920s and ’30s, Thomas Morgan used fruit flies to demonstrate that genes were physically linked to each other, presciently predicting that a single, chainlike molecule carried genetic information. Linkage between genes would eventually be used to generate genetic maps in humans and lay the foundation for the Human Genome Project. This is Morgan in his Caltech Fly Room, surrounded by the milk bottles in which he bred his maggots and flies.

  Rosalind Franklin looks down a microscope at King’s College in London in the 1950s. Franklin used X-ray crystallography to photograph and study the structure of DNA. Photograph 51 is the clearest of Franklin’s photographs of a DNA crystal. The photo suggested a double-helix structure, although the precise orientations of the bases A, C, T, and G were not clear from it.

  James Watson and Francis Crick demonstrate their model of DNA as a double helix in Cambridge in 1953. Watson and Crick solved the structure of DNA by realizing that the A in one strand was paired against the T in the other, and the G against the C.

  At the Moore Clinic in Baltimore in the 1950s, Victor McKusick created a vast catalog of human mutations. He found that one phenotype—short stature, or “dwarfism”—could be caused by mutations in several disparate genes. Conversely, diverse phenotypes could be caused by mutations in a single gene.

  Nancy Wexler’s mother and uncles were diagnosed with Huntington’s disease, a lethal neurodegenerative disease that spurs involuntary sinuous or jerking movements. The diagnosis launched her personal hunt for the gene that causes the illness. Wexler found a cluster of several patients with Huntington’s disease in Venezuela, all likely descended from a founder with the disease. Huntington’s disease was one of the first human diseases to be definitively linked to a single gene using modern gene-mapping methods.

  Students protest a genetics meeting in the 1970s. The novel technologies of gene sequencing, gene cloning, and recombinant DNA raised anxieties that new forms of eugenics would be used to create a “perfect race.” The link to Nazi eugenics was not forgotten.

  Herb Boyer (left) and Robert Swanson founded Genentech in 1976 to produce medicines out of genes. The drawing on the blackboard shows the scheme to produce insulin using recombinant DNA technology. The first such proteins were produced in enormous bacterial incubators under Swanson’s watchful eye.

  Paul Berg speaks to Maxine Singer at the Asilomar meeting in 1975, while Sydney Brenner takes notes in the background. Following the discovery of technologies to create genetic hybrids between genes (recombinant DNA) and produce millions of copies of these hybrids in bacterial cells (gene cloning), Berg and others proposed a “moratorium” on certain recombinant DNA work until the risks had been adequately assessed.

  Frederick Sanger examines a DNA sequencing gel. Sanger’s invention of a technique to sequence DNA (i.e., read the precise stretch of letters—A, C, T, and G—in a gene’s sequence) revolutionized our understanding of genes, and set the stage for the Human Genome Project.

  Jesse Gelsinger poses in Philadelphia a few months before his death in 1999. Gelsinger was one of the first patients to be treated with gene therapy. A virus was designed to deliver the correct form of a mutated gene into his liver, but Gelsinger had a brisk immunological response to the virus, resulting in organ failure and death. Gelsinger’s “biotech death” would spur nationwide responses to ensure the safety of gene-therapy trials.

  The February 2001 cover of Science magazine announced the draft sequence of the human genome.

  Craig Venter (left), President Bill Clinton, and Francis Collins announce the draft sequence of the human genome on June 26, 2000, at the White House.

  Even without subtle techniques to alter human genomes, the capacity to assess a child’s genome in utero has led to vast dysgenic efforts around the world. In parts of China and India, the assessment of male versus female gender by amniocentesis, and the selective abortion of female fetuses, has skewed the sex ratio to 0.8 females to 1 male, and caused unprecedented alterations of population and family structures.

  Faster and more accurate gene-sequencing machines (housed inside gray boxlike containers) linked to supercomputers that analyze and annotate genetic information can now sequence individual human genomes in months. Variations of this technique can be used to sequence the genome of a multicelled embryo or a fetus, enabling preimplantation genetic diagnosis and in utero diagnosis of future illness.

  Jennifer Doudna (right), a biologist and RNA researcher at Berkeley, is among those working on a system to deliver targeted, intentional mutations in genes. In principle, the system can be used to “edit” the human genome, although the technology still remains to be perfected and assessed for safety and fidelity. If intentional genetic changes were introduced into sperm, egg, or human embryonic stem cells, the technology would portend the genesis of humans with altered genes.

  Acknowledgments

  When I completed the final draft of the six-hundred-page Emperor of All Maladies in May 2010, I never thought I would lift a pen to write another book. The physical exhaustion of writing Emperor was easy to fathom and overcome, but the exhaustion of imagination was unexpected. When the book won the Guardian First Book Prize that year, one reviewer complained that it should have been nominated for the Only Book Prize. The critique cut to the bone of my fears. Emperor had sapped all my stories, confiscated my passports, and placed a lien on my future as a writer; I had nothing more to tell.
