The Violinist's Thumb: And Other Lost Tales of Love, War, and Genius, as Written by Our Genetic Code
Thankfully, though, whenever science threatened to become the real HGP story, something juicy happened to distract everyone. For example, in early 2000 President Clinton announced, seemingly out of the clear blue sky, that the human genome belonged to all people worldwide, and he called on all scientists, including ones in the private sector, to share sequence information immediately. There were also whispers of the government eliminating gene patents, and investors with money in sequencing companies stampeded. Celera got trampled, losing $6 billion in stock value—$300 million of it Venter’s—in just weeks. As a balm against this and other setbacks, Venter tried around this time to secure a piece of Einstein’s brain, to see if someone could sequence its DNA after all,* but the plan came to naught.
Almost touchingly, a few people held out hope that Celera and the consortium could still work together. Sulston had put the kibosh on a cease-fire with Venter in 1999, but shortly thereafter other scientists approached Venter and Collins to broker a truce. They even floated the idea of the consortium and Celera publishing the 90-percent-complete rough draft of the human genome as one joint paper. Negotiations proceeded apace, but government scientists remained chary of Celera’s business interests and bristled over its refusal to publish data immediately. Throughout the negotiations, Venter displayed his usual charm; one consortium scientist swore in his face, countless others behind his back. A New Yorker profile of Venter from the time opened with a (cowardly anonymous) quote from a senior scientist: “Craig Venter is an asshole.” Not surprisingly, plans for a joint publication eventually disintegrated.
Appalled by the bickering, and eyeing an upcoming election, Bill Clinton finally intervened and convinced Collins and Venter to appear at a press conference at the White House in June 2000. There the two rivals announced that the race to sequence the human genome had ended—in a draw. This truce was arbitrary and, given the lingering resentments, largely bogus. But rather than growling, both Collins and Venter wore genuine smiles that summer day. And why not? It was less than a century after scientists had identified the first human gene, less than fifty years after Watson and Crick had elucidated the double helix. Now, at the millennium, the sequencing of the human genome promised even more. It had even changed the nature of biological science. Nearly three thousand scientists contributed to the two papers that announced the human genome’s rough draft. Clinton had famously declared, “The era of big government is over.” The era of big biology was beginning.
The two papers outlining the rough draft of the human genome appeared in early 2001, and history should be grateful that the joint publication fell apart. A single paper would have forced the two groups into false consensus, whereas the dueling papers highlighted each side’s unique approach—and exposed various canards that had become accepted wisdom.
In its paper, Celera acknowledged that it had poached the free consortium data to help build part of its sequence—which sure undermined Venter’s rebel street cred. Furthermore, consortium scientists argued that Celera wouldn’t even have finished without the consortium maps to guide the assembly of the randomly shotgunned pieces. (Venter’s team published angry rebuttals.) Sulston also challenged the Adam Smith–ish idea that the competition increased efficiency and forced both sides to take innovative risks. Instead, he argued, Celera diverted energy away from sequencing and toward silly public posturing—and sped up the release only of the “fake” rough draft anyway.
Of course, scientists loved the draft, however rough, and the consortium would never have pushed itself to publish one so soon had Venter not flipped his gauntlet at their face. And whereas the consortium had always portrayed itself as the adults here—the ones who didn’t care about speedy genomic hot-rodding, just accuracy—most scientists who examined the two drafts side by side proclaimed that Celera did a better job. Some said its sequence was twice as good and less riddled with virus contamination. The consortium also (quietly) put the lie to its criticisms of Venter by copying the whole-genome shotgun approach for later sequencing projects, like the mouse genome.
By then, however, Venter wasn’t around to bother the public consortium. After various management tussles, Celera all but sacked Venter in January 2002. (For one thing, Venter had refused to patent most genes that his team discovered; behind the scenes, he was a rather indifferent capitalist.) When Venter left, Celera lost its sequencing momentum, and the consortium claimed victory, loudly, when it alone produced a full human genome sequence in early 2003.*
After years of adrenalized competition, however, Venter, like a fading football star, couldn’t simply walk away. In mid-2002 he diverted attention from the consortium’s ongoing sequencing efforts by revealing that Celera’s composite genome had actually been 60 percent Venter sperm DNA; he had been the primary “anonymous” donor. And, undisturbed by the tsk-tsking that followed his revelation—“vainglorious,” “egocentric,” and “tacky” were some of the nicer judgments—Venter decided he wanted to analyze his pure DNA, unadulterated by other donors. To this end, he founded a new institute, the Center for the Advancement of Genomics (TCAG, har, har), that would spend $100 million over four years to sequence him and him alone.
This was supposed to be the first complete individual genome—the first genome that, unlike the Platonic HGP genome, included both the mother’s and father’s genetic contributions, as well as every stray mutation that makes a person unique. But because Venter’s group spent four whole years polishing his genome, base by base, a group of rival scientists decided to jump into the game and sequence another individual first—none other than Venter’s old nemesis, James Watson. Ironically, the second team—dubbed Project Jim—took a cue from Venter and tried to snatch the prize with new, cheaper, dirtier sequencing methods, ripping through Watson’s full genome in four months and for a remarkably modest sum, around $2 million. Venter, being Venter, refused to concede defeat, though, and this second genome competition ended, probably inevitably, in another draw: the two teams posted their sequences online within days of each other in summer 2007. The speedy machines of Project Jim wowed the world, but Venter’s sequence once again proved more accurate and useful for most research.
(The jockeying for status hasn’t ended, either. Venter remains active in research, as he’s currently trying to determine [by subtracting DNA from microbes, gene by gene] the minimum genome necessary for life. And however tacky the action might have seemed, publishing his individual genome might have put him in the catbird seat for the Nobel Prize—an honor that, according to the scuttlebutt that scientists indulge in over late-night suds, he covets. A Nobel can be split among three people at most, but Venter, Collins, Sulston, Watson, and others could all make legitimate claims for one. The Swedish Nobel committee would have to overlook Venter’s lack of decorum, but if it awards him a solo Nobel for his consistently excellent work, Venter can claim he won the genome war after all.*)
So what did all the HGP competition earn us, science-wise? Depends on whom you ask.
Most human geneticists aim to cure diseases, and they felt certain that the HGP would reveal which genes to target for heart disease, diabetes, and other widespread problems. Congress in fact spent $3 billion largely on this implicit promise. But as Venter and others have pointed out, virtually no genetic-based cures have emerged since 2000; virtually none appear imminent, either. Even Collins has swallowed hard and acknowledged, as diplomatically as possible, that the pace of discoveries has frustrated everyone. It turns out that many common diseases have more than a few mutated genes associated with them, and it’s nigh impossible to design a drug that targets more than a few genes. Worse, scientists can’t always pick out the significant mutations from the harmless ones. And in some cases, scientists can’t find mutations to target at all. Based on inheritance patterns, they know that certain common diseases must have significant genetic components—and yet, when scientists scour the genes of victims of those diseases, they find few if any shared genetic flaws. The “culprit DNA” has gone missing.
There are a few possible reasons for these setbacks. Perhaps the real disease culprits lie in noncoding DNA that lies outside of genes, in regions scientists understand only vaguely. Perhaps the same mutation leads to different diseases in different people because of interactions with their other, different genes. Perhaps the odd fact that some people have duplicate copies of some genes is somehow critically important. Perhaps sequencing, which blasts chromosomes into bits, destroys crucial information about chromosome structure and architectural variation that could tell scientists what genes work together and how. Most scary of all—because it highlights our fundamental ignorance—perhaps the idea of a common, singular “disease” is illusory. When doctors see similar symptoms in different people—fluctuating blood sugar, joint pain, high cholesterol—they naturally assume similar causes. But regulating blood sugar or cholesterol requires scores of genes to work together, and a mutation in any one gene in the cascade could disrupt the whole system. In other words, even if the large-scale symptoms are identical, the underlying genetic causes—what doctors need to pinpoint and treat—might be different. (Some scientists misquote Tolstoy to make this point: perhaps all healthy bodies resemble each other, while each unhealthy body is unhealthy in its own way.) For these reasons, some medical scientists have mumbled that the HGP has—kinda, sorta, so far—flopped. If so, maybe the best “big science” comparison isn’t the Manhattan Project but the Apollo space program, which got man to the moon but fizzled afterward.
Then again, whatever the shortcomings (so far) in medicine, sequencing the human genome has had trickle-down effects that have reinvigorated, if not reinvented, virtually every other field of biology. Sequencing DNA led to more precise molecular clocks, and revealed that animals harbor huge stretches of viral DNA. Sequencing helped scientists reconstruct the origins and evolution of hundreds of branches of life, including those of our primate relatives. Sequencing helped trace the global migration of humans and showed how close we came to extinction. Sequencing confirmed how few genes humans have (the lowest guess, 25,947, won the gene sweepstakes), and forced scientists to realize that the exceptional qualities of human beings derive not so much from having special DNA as from regulating and splicing DNA in special ways.
Finally, having a full human genome—and especially having the individual genomes of Watson and Venter—emphasized a point that many scientists had lost sight of in the rush to sequence: the difference between reading a genome and understanding it. Both men risked a lot by publishing their genomes. Scientists across the world pored over them letter by letter, looking for flaws or embarrassing revelations, and the two men had different attitudes about this risk. The apoE gene enhances our ability to eat meat but also (in some versions) multiplies the risk for Alzheimer’s disease. Watson’s grandmother succumbed to Alzheimer’s years ago, and the prospect of losing his own mind was too much to bear, so he requested that scientists not reveal which version of the apoE gene he had. (Unfortunately, the scientists he trusted to conceal these results didn’t succeed.*) Venter blocked nothing about his genome and even made private medical records available. This way, scientists could correlate his genes with his height and weight and various aspects of his health—information that, in combination, is much more medically useful than genomic data alone. It turns out that Venter has genes that incline him toward alcoholism, blindness, heart disease, and Alzheimer’s, among other ailments. (More strangely, Venter also has long stretches of DNA not normally found in humans but common in chimps. No one knows why, but no doubt some of Venter’s enemies have suspicions.) In addition, a comparison between Venter’s genome and the Platonic HGP genome revealed far more deviations than anyone expected—four million mutations, inversions, insertions, deletions, and other quirks, any of which might have been fatal. Yet Venter, now approaching seventy years old, has skirted these health problems. Similarly, scientists have noted two places in Watson’s genome with two copies of devastating recessive mutations—for Usher syndrome (which leaves victims deaf and blind), and for Cockayne syndrome (which stunts growth and prematurely ages people). Yet Watson, well over eighty, has never shown any hint of these problems.
So what gives? Did Watson’s and Venter’s genomes lie to us? What’s wrong with our reading of them? We have no reason to think Watson and Venter are special, either. A naive perusal of anybody’s genome would probably sentence him to sicknesses, deformities, and a quick death. Yet most of us escape. It seems that, however powerful, the A-C-G-T sequence can be circumscribed by extragenetic factors—including our epigenetics.
15
Easy Come, Easy Go?
How Come Identical Twins Aren’t Identical?
The prefix epi- implies something piggybacking on something else. Epiphyte plants grow on other plants. Epitaphs and epigraphs appear on gravestones and in portentous books. Green things like grass happen to reflect light waves at 550 nm (phenomenon), yet our brains register that light as a color, something laden with memory and emotion (epiphenomenon). When the Human Genome Project left scientists in some ways knowing less than before—how could twenty-two thousand measly genes, fewer than some grapes have, create complex human beings?—geneticists renewed their emphasis on gene regulation and gene-environment interactions, including epigenetics.
Like genetics, epigenetics involves passing along certain biological traits. But unlike genetic changes, epigenetic changes don’t alter the hardwired A-C-G-T sequence. Instead, epigenetic inheritance affects how cells access, read, and use DNA. (You can think of genes and DNA as hardware, epigenetics as software.) And while biology often distinguishes between environment (nurture) and genes (nature), epigenetics combines nature with nurture in novel ways. Epigenetics even hints that we can sometimes inherit the nurture part—that is, inherit biological memories of what our mothers and fathers (or grandmothers and grandfathers) ate and breathed and endured.
Frankly, it’s tricky to sort out true epigenetics (or “soft inheritance”) from other gene-environment interactions. It doesn’t help that epigenetics has traditionally been a grab bag of ideas, the place scientists toss every funny inheritance pattern they discover. On top of everything else, epigenetics has a cursed history, littered with starvation, disease, and suicide. But no other field holds such promise for achieving the ultimate goal of human biology: making the leap from HGP molecular minutiae to understanding the quirks and individuality of full-scale human beings.
Though an advanced science, epigenetics actually revives an ancient debate in biology, with combatants who predate Darwin—the Frenchman Jean-Baptiste Lamarck and his countryman, our old friend Baron Cuvier.
Just as Darwin made his name studying obscure species (barnacles), Lamarck cut his teeth on vermes. Vermes translates to “worms” but in those days included jellyfish, leeches, slugs, octopuses, and other slippery things that naturalists didn’t stoop to classify. Lamarck, more discriminating and sensitive than his colleagues, rescued these critters from taxonomic obscurity by highlighting their unique traits and separating them into distinct phyla. He soon invented the term invertebrates for this miscellany, and in 1800 went one better and coined the word biology for his whole field of study.
Lamarck became a biologist through roundabout means. The moment his pushy father died, Lamarck dropped out of seminary school, bought a creaky horse, and, just seventeen years old, galloped away to join the Seven Years’ War. His daughter later claimed that Lamarck distinguished himself there, earning a battlefield promotion to officer, though she often exaggerated his achievements. Regardless, Lt. Lamarck’s career ended ignominiously when his men, playing some sort of game that involved lifting Lamarck by his head, injured him. The military’s loss was biology’s gain, and he soon became a renowned botanist and vermologist.
Not content with dissecting worms, Lamarck devised a grandiloquent theory—the first scientific one—about evolution. The theory had two parts. The overarching bit explained why evolution happened period: all creatures, he argued, had “inner urges” to “perfect” themselves by becoming more complex, more like mammals. The second half dealt with the mechanics of evolution, how it occurred. And this was the part that overlaps, conceptually at least, with modern epigenetics, because Lamarck said that creatures changed shape or behavior in response to their environment, then passed those acquired traits on.
Jean-Baptiste Lamarck devised perhaps the first scientific theory of evolution. Though mistaken, his theory resembles in some ways the modern science of epigenetics. (Portrait by Louis-Léopold Boilly)
For instance, Lamarck suggested that wading shorebirds, straining to keep their derrieres dry, stretched their legs microscopically farther each day and eventually acquired longer legs, which baby birds inherited. Similarly, giraffes reaching to the tippity-tops of trees for leaves acquired long necks and passed those on. It supposedly worked in humans, too: blacksmiths, after swinging hammers year after year, passed their impressive musculature down to their children. Note that Lamarck didn’t say that creatures born with longer appendages or faster feet or whatever had an advantage; instead creatures worked to develop those traits. And the harder they worked, the better the endowment they passed to their children. (Shades of Weber there, and the Protestant work ethic.) Never a modest man, Lamarck announced the “perfecting” of his theory around 1820.