• It contains thousands of “pseudogenes”—genes that were once functional but have become nonfunctional, i.e., they give rise to no protein or RNA. The carcasses of these inactivated genes are littered throughout its length, like fossils decaying on a beach.
• It accommodates enough variation to make each one of us distinct, yet enough consistency to make each member of our species profoundly different from chimpanzees and bonobos, whose genomes are 96 percent identical to ours.
• Its first gene, on chromosome one, encodes a protein that senses smell in the nose (again: those ubiquitous olfactory genes!). Its last gene, on chromosome X, encodes a protein that modulates the interaction between cells of the immune system. (The “first” and “last” chromosomes are arbitrarily assigned. The first chromosome is labeled first because it is the longest.)
• The ends of its chromosomes are marked with “telomeres.” Like the little bits of plastic at the ends of shoelaces, these sequences of DNA are designed to protect the chromosomes from fraying and degenerating.
• Although we fully understand the genetic code—i.e., how the information in a single gene is used to build a protein—we comprehend virtually nothing of the genomic code—i.e., how multiple genes spread across the human genome coordinate gene expression in space and time to build, maintain, and repair a human organism. The genetic code is simple: DNA is used to build RNA, and RNA is used to build a protein. A triplet of bases in DNA specifies one amino acid in the protein (a short, illustrative sketch of this triplet lookup appears just after this list). The genomic code is complex: appended to a gene are sequences of DNA that carry information on when and where to express the gene. We do not know why certain genes are located in particular geographic locations in the genome, and how the tracts of DNA that lie between genes regulate and coordinate gene physiology. There are codes beyond codes, like mountains beyond mountains.
• It imprints and erases chemical marks on itself in response to alterations in its environment—thereby encoding a form of cellular “memory” (there is more to come on this topic).
• It is inscrutable, vulnerable, resilient, adaptable, repetitive, and unique.
• It is poised to evolve. It is littered with the debris of its past.
• It is designed to survive.
• It resembles us.
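To make that contrast concrete, here is a minimal sketch, in Python, of the "simple" triplet code described in the list above. The codon table is only a small, illustrative excerpt of the standard genetic code, and the input sequence is an invented toy example rather than a real gene.

```python
# The "simple" genetic code: DNA is read three bases at a time, and each
# triplet (codon) specifies one amino acid. Only a handful of the 64
# codons are listed here, purely for illustration.

CODON_TABLE = {  # partial excerpt of the standard genetic code
    "ATG": "Met",  # methionine, also the usual "start" signal
    "TTT": "Phe", "TTC": "Phe",
    "AAA": "Lys", "AAG": "Lys",
    "GGC": "Gly", "GAA": "Glu", "GAT": "Asp", "TGG": "Trp",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(coding_dna):
    """Read a coding-strand DNA sequence codon by codon into amino acids."""
    protein = []
    for i in range(0, len(coding_dna) - 2, 3):
        amino_acid = CODON_TABLE.get(coding_dna[i:i + 3], "???")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTAAAGGCTAA"))  # ['Met', 'Phe', 'Lys', 'Gly']
```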
PART FIVE
* * *
THROUGH THE LOOKING GLASS
The Genetics of Identity and “Normalcy”
(2001–2015)
How nice it would be if we could only get through into Looking-glass House! I’m sure it’s got, oh! such beautiful things in it!
—Lewis Carroll, Through the Looking-Glass
“So, We’s the Same”
We got to have a re-vote. This ain’t right.
—Snoop Dogg, on discovering that he has more European ancestry than Charles Barkley
What have I in common with Jews? I have hardly anything in common with myself.
—Franz Kafka
Medicine, the sociologist Everett Hughes once observed wryly, perceives the world through “mirror writing.” Illness is used to define wellness. Abnormalcy marks the boundaries of normalcy. Deviance demarcates the limits of conformity. This mirror writing can result in an epically perverse vision of the human body. An orthopedist thus begins to think of bones as sites of fractures; a brain, in a neurologist’s imagination, is a place where memories are lost. There is an old story, probably apocryphal, of a Boston surgeon who did lose his memory and could only recall his friends by the names of the various operations he had performed on them.
Throughout much of the history of human biology, genes too were largely perceived in mirror writing—identified by the abnormality or disease caused when they mutated. Hence the cystic fibrosis gene, the Huntington’s gene, the breast-cancer-causing BRCA1 gene, and so forth. To a biologist the nomenclature is absurd: the function of the BRCA1 gene is not to cause breast cancer when mutated, but to repair DNA when normal. The sole function of the “benign” breast cancer gene BRCA1 is to make sure that DNA is repaired when it is damaged. The hundreds of millions of women without a family history of breast cancer inherit this benign variant of the BRCA1 gene. The mutant variant or allele—call it m-BRCA1—causes a change in the structure of the BRCA1 protein such that it fails to repair damaged DNA. Hence cancer-causing mutations arise in the genome when BRCA1 malfunctions.
The gene called wingless in fruit flies does not exist to make wingless insects; its real function is to encode a protein needed to build wings. Naming a gene cystic fibrosis (or CF), as the science writer Matt Ridley once described it, is “as absurd as defining the organs of the body by the diseases they get: livers are there to cause cirrhosis, hearts to cause heart attacks, and brains to cause strokes.”
The Human Genome Project allowed geneticists to invert this mirror writing on itself. The comprehensive catalog of every normal gene in the human genome—and the tools generated to produce such a catalog—made it possible, in principle, to approach genetics from the front side of its mirror: it was no longer necessary to use pathology to define the borders of normal physiology. In 1988, a National Research Council document on the Genome Project made a crucial projection about the future of genomic research: “Encoded in the DNA sequence are fundamental determinants of those mental capacities—learning, language, memory—essential to human culture. Encoded there as well are the mutations and variations that cause or increase the susceptibility to many diseases responsible for much human suffering.”
Vigilant readers may have noted that the two sentences signaled the twin ambitions of a new science. Traditionally, human genetics had concerned itself largely with pathology—with “diseases responsible for much human suffering.” But armed with new tools and methods, genetics could also roam freely to explore aspects of human biology that had hitherto seemed impenetrable to it. Genetics had crossed over from the strand of pathology to the strand of normalcy. The new science would be used to understand history, language, memory, culture, sexuality, identity, and race. It would, in its most ambitious fantasies, try to become the science of normalcy: of health, of identity, of destiny.
The shift in the trajectory of genetics also signals a shift in the story of the gene. Until this point, the organizing principle of our story has been historical: the journey from the gene to the Genome Project has moved through a relatively linear chronology of conceptual leaps and discoveries. But as human genetics shifted its glance from pathology to normalcy, a strictly chronological approach could no longer capture the diverse dimensions of its inquiry. The discipline shifted to a more thematic focus, organizing itself around distinct, albeit overlapping, arenas of inquiry concerning human biology: the genetics of race, gender, sexuality, intelligence, temperament, and personality.
The expanded dominion of the gene would vastly deepen our understanding of the influence of genes on our lives. But the attempt to confront human normalcy through genes would also force the science of genetics to confront some of the most complex scientific and moral conundrums in its history.
To understand what genes tell us about human beings, we might begin by trying to decipher what genes tell us about the origins of human beings. In the mid-nineteenth century, before the advent of human genetics, anthropologists, biologists, and linguists fought furiously over the question of human origin. In 1854, a Swiss-born natural historian named Louis Agassiz became the most ardent proponent of a theory called polygenism, which suggested that the three major human races—Whites, Asians, and Negroes, as he liked to categorize them—had arisen independently, from separate ancestral lineages, several million years ago.
Agassiz was arguably the most distinguished racist in the history of science—“racist” both in the original sense of the word, a believer in the inherent differences between human races, and in an operational sense, a believer that some races were fundamentally superior to others. Repulsed by the horror that he might share a common ancestor with Africans, Agassiz maintained that each race had its unique forefather and foremother and had arisen and forked out independently over space and time. (The name Adam, he suggested, arose from the Hebrew word for “one who blushes,” and only a white man could detectably blush. There had to be multiple Adams, Agassiz concluded—blushers and nonblushers—one for each race.)
In 1859, Agassiz’s theory of multiple origins was challenged by the publication of Darwin’s Origin of Species. Although Origin pointedly dodged the question of human origin, Darwin’s notion of evolution by natural selection was obviously incompatible with Agassiz’s separate ancestry of all human races: if finches and tortoises had cascaded out from a common ancestor, why would humans be different?
As academic duels go, this one was almost comically one-sided. Agassiz, a grandly bewhiskered Harvard professor, was among the most prominent natural historians of the world, while Darwin, a doubt-ridden, self-taught parson-turned-naturalist from the “other” Cambridge, was still virtually unknown outside England. Still, recognizing a potentially fatal confrontation when he saw one, Agassiz issued a scalding rebuttal to Darwin’s book. “Had Mr. Darwin or his followers furnished a single fact to show that individuals change, in the course of time, in such a manner as to produce, at last, species . . . the state of the case might be different,” he thundered.
But even Agassiz had to concede that his theory of separate ancestors for separate races ran the risk of being challenged not by a “single fact” but by a multitude of facts. In 1856, stone diggers in a limestone quarry in the Neander Valley in Germany had accidentally unearthed a peculiar skull that resembled a human skull but also possessed substantial differences, including a larger cranium, a recessed chin, powerfully articulated jawbones, and a pronounced outward jut of the brow. At first, the skull was dismissed as the remnant of a freak that had suffered an accident—a madman stuck in a cave—but over the next decades, a host of similar skulls and bones were disgorged from gorges and caves scattered across Europe and Asia. The bone-by-bone reconstruction of these specimens pointed to a strongly built, prominently browed species that walked upright on somewhat bowed legs—an ornery wrestler with a permanent frown. The hominid was called a Neanderthal, after the valley where it was first found.
Initially, many scientists believed that the Neanderthals represented an ancestral form of modern humans, one link in the chain of missing links between humans and apes. In 1922, for instance, an article in Popular Science Monthly described the Neanderthal as belonging to “an early time in the evolution of man.” Accompanying the text was a variant of a now-familiar image of human evolution, with gibbonlike monkeys transmuting into gorillas, gorillas into upright-walking Neanderthals, and so forth, until humans were formed. But by the 1970s and 1980s, the Neanderthal-as-human-ancestor hypothesis had been debunked and replaced by a much stranger idea—that early modern humans coexisted with Neanderthals. The “chain of evolution” drawings were revised to reflect that gibbons, gorillas, Neanderthals, and modern humans were not progressive stages of human evolution, but had all emerged from a common ancestor. Further anthropological evidence suggested that modern humans—then called Cro-Magnons—had arrived on the Neanderthals’ scene around forty-five thousand years ago, most likely by migrating into parts of Europe where Neanderthals lived. We now know that Neanderthals became extinct about forty thousand years ago, having overlapped with early modern humans for about five thousand years.
Cro-Magnons are, indeed, our closer, truer ancestors, possessing the smaller skull, flattened face, receded brow, and thinner jaw of contemporary humans (the politically correct phrase for the anatomically correct Cro-Magnons is European Early Modern Human or EEMH). These early modern humans intersected with Neanderthals, at least in parts of Europe, and likely competed with them for resources, food, and space. Neanderthals were our neighbors and rivals. Some evidence suggests that we interbred with them and that in competing with them for food and resources we may have contributed to their extinction. We loved them—and, yes, we killed them.
But the distinction between Neanderthals and modern humans returns us, full circle, to our original questions: How old are humans, and where did we come from? In the 1980s, a biochemist at the University of California, Berkeley, named Allan Wilson began to use genetic tools to answer these questions. Wilson’s experiment began with a rather simple idea. Imagine being thrown into a Christmas party. You don’t know the host or the guests. A hundred men, women, and children are milling around, drinking punch, and suddenly a game begins: You are asked to arrange the crowd by family, relatedness, and descent. You cannot ask for names or ages. You are blindfolded; you are not allowed to construct family trees by looking at facial resemblances or studying mannerisms.
To a geneticist, this is a tractable problem. First, he recognizes the existence of hundreds of natural variations—mutations—scattered through each individual genome. The more closely two individuals are related, the larger the fraction of these variants they share (identical twins share the entire genome; fathers and mothers contribute, on average, half to their children, and so forth). If these variants can be sequenced and identified in each individual, lineage can be solved immediately: relatedness is a function of mutatedness. Just as facial features or skin color or height are shared among related individuals, variations are more commonly shared within families than across families (indeed, facial features and heights are shared because genetic variations are shared among individuals).
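A rough sketch of that logic, assuming invented guests and variant labels: each person is reduced to the set of variants found in their genome, pairwise similarity is the fraction of variants a pair shares, and people who overlap strongly are grouped into the same family. The similarity measure and the 0.25 threshold are illustrative assumptions, not Wilson's actual procedure.

```python
# Hypothetical guests, each represented only by the set of variants
# ("mutations") detected in their genome.
guests = {
    "grandmother": {"v1", "v2", "v3", "v4"},
    "mother":      {"v1", "v2", "v5", "v6"},
    "child":       {"v1", "v5", "v7", "v8"},
    "stranger":    {"v9", "v10", "v11", "v12"},
}

def shared_fraction(a, b):
    """Fraction of variants shared by two people (Jaccard similarity)."""
    return len(a & b) / len(a | b)

# Greedily place each guest into the first family whose members they
# resemble closely enough; otherwise start a new family.
families = []
for name, variants in guests.items():
    for family in families:
        if any(shared_fraction(variants, guests[m]) > 0.25 for m in family):
            family.append(name)
            break
    else:
        families.append([name])

print(families)  # [['grandmother', 'mother', 'child'], ['stranger']]
```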
And what if the geneticist is also asked to find the family with the most generations present, without knowing the ages of any of the individuals at the party? Suppose one family is represented by a great-grandfather, grandfather, father, and son at the celebration; four generations are present. Another family also has four attendees—a father and his identical triplets, representing just two generations. Can we identify the family with the most generations in the crowd with no prior knowledge about faces or names? Merely counting the number of members in a family will not do: the father-and-triplet family, and the great-grandfather and his multigenerational descendants, each have the same number of family members: four.
Genes and mutations provide a clever solution. Since mutations accumulate over generations—i.e., over intergenerational time—the family with the greatest diversity in gene variations is the one with the most generations. The triplets have exactly the same genome; their genetic diversity is minimal. The great-grandfather and great-grandson pair, in contrast, have related genomes—but their genomes have the most differences. Evolution is a metronome, ticktocking time through mutations. Genetic diversity thus acts as a “molecular clock,” and variations can organize lineage relationships. The intergenerational time between any two family members is proportional to the extent of genetic diversity between them.
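The clock arithmetic itself is, in caricature, a count and a division. In the sketch below, the sequences and the rate of one mutation per generation are invented solely to illustrate the idea that accumulated differences measure intergenerational distance.

```python
MUTATIONS_PER_GENERATION = 1.0  # hypothetical rate, for illustration only

def differences(seq_a, seq_b):
    """Count positions at which two equal-length sequences disagree."""
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def generations_apart(seq_a, seq_b):
    """Molecular-clock estimate: differences divided by mutation rate."""
    return differences(seq_a, seq_b) / MUTATIONS_PER_GENERATION

triplet_a   = "ACGTACGTACGT"  # identical triplets: no differences at all
triplet_b   = "ACGTACGTACGT"
grandparent = "ACGTACGTACGT"
descendant  = "ACCTATGTATGT"  # three accumulated differences

print(generations_apart(triplet_a, triplet_b))     # 0.0 generations
print(generations_apart(grandparent, descendant))  # 3.0 generations
```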
Wilson realized that this technique could be applied not just across a family, but across an entire population of organisms. Variations in genes could be used to create a map of relatedness. And genetic diversity could be used to measure the oldest populations within a species: a tribe that has the most genetic diversity within it is older than a tribe with little or no diversity.
Wilson had almost solved the problem of estimating the age of any species using genomic information—except for a glitch. If genetic variation were produced only by mutation, Wilson’s method would be absolutely fail-safe. But genes, Wilson knew, are present in two copies in most human cells, and they can “cross over” between paired chromosomes, generating variation and diversity by an alternative method. This method of generating variation would inevitably confound Wilson’s study. To construct an ideal genetic lineage, Wilson realized, he needed a stretch of the human genome that was intrinsically resistant to reassortment and crossing over—a lonely, vulnerable corner of the genome where change can occur only through the accumulation of mutations, thereby allowing that genomic segment to act as the perfect molecular clock.
But where might he find such a vulnerable stretch? Wilson’s solution was ingenious. Human genes are stored in chromosomes in the nucleus of the cell, but with one exception. Every cell possesses subcellular structures called mitochondria that are used to generate energy. Mitochondria have their own mini-genome, with only thirty-seven genes, about one six-hundredth the number of genes on human chromosomes. (Some scientists propose that mitochondria originated from ancient bacteria that invaded single-celled organisms. These bacteria formed a symbiotic alliance with the organism; they provided energy, but used the organism’s cellular environment for nutrition, metabolism, and self-defense. The genes lodged within mitochondria are left over from this ancient symbiotic relationship; indeed, human mitochondrial genes resemble bacterial genes more than human ones.)
The mitochondrial genome rarely recombines and, unlike the paired chromosomes of the nucleus, is inherited as a single, unpaired copy. Mutations in mitochondrial genes are passed intact across generations, and they accumulate over time without crossing over, making the mitochondrial genome an ideal genetic timekeeper. Crucially, Wilson realized, this method of age reconstruction was entirely self-contained and independent of bias: it made no reference to the fossil record, to linguistic lineages, geologic strata, geographical maps, or anthropological surveys. Living humans are endowed with the evolutionary history of our species in our genomes. It is as if we permanently carry a photograph of each of our ancestors in our wallets.
Between 1985 and 1995, Wilson and his students learned to apply these techniques to human specimens (Wilson died of leukemia in 1991, but his students continued his work). The results of these studies were startling for three reasons. First, when Wilson measured the overall diversity of the human mitochondrial genome, he found it to be surprisingly small—less diverse than the corresponding genomes of chimpanzees. Modern humans, in other words, are substantially younger and substantially more homogeneous than chimpanzees (every chimp might look like every other chimp to human eyes, but to a discerning chimpanzee, it is humans that are vastly more alike). Calculating backward from this level of diversity, Wilson estimated the age of modern humans at about two hundred thousand years—a minor blip, a ticktock, in the scale of evolution.
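The "calculating backward" step is, in outline, a single division: the fraction of sites at which two mitochondrial lineages differ, divided by twice the per-site mutation rate per year (both lineages have been accumulating changes since they split). The two numbers below are placeholders chosen only to show the shape of the arithmetic, not Wilson's actual measurements.

```python
# Hypothetical inputs, for illustration only.
pairwise_divergence = 0.004       # assumed: 0.4% of mitochondrial sites differ
rate_per_site_per_year = 1e-8     # assumed per-lineage mutation rate

# Age of the common ancestor: divergence / (2 x rate), since mutations
# accumulate independently along both diverging lineages.
age_in_years = pairwise_divergence / (2 * rate_per_site_per_year)
print(f"{age_in_years:,.0f} years")  # 200,000 years
```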