
The Violinist's Thumb: And Other Lost Tales of Love, War, and Genius, as Written by Our Genetic Code


by Sam Kean


  Even less forgivable, Buckland pretty much dropped a steaming coprolite on one of the most spectacular archaeological discoveries ever. In 1829 Philippe-Charles Schmerling unearthed, among some ancient animal remains, a few uncanny, human-but-not-quite-human bones in Belgium. Basing his conclusions especially on skull fragments from a child, he suggested they belonged to an extinct hominid species. Buckland examined the bones in 1835 at a scientific meeting but never removed his biblical blinders. He rejected Schmerling’s theory and, instead of saying so quietly, proceeded to humiliate him. Buckland often claimed that because of various chemical changes, fossilized bones naturally stick to the tongue, whereas fresher bones don’t. During a lecture at the meeting, Buckland placed onto his tongue one of the animal bones (a bear’s) that Schmerling had found mixed in with the hominid remains. The bear bone stuck fast, and Buckland continued to lecture, the bone flopping about hilariously. He then challenged Schmerling to stick his “extinct human” bones to his own tongue. They fell off. Ergo they weren’t ancient.

  Though hardly definitive proof, the dismissal lingered in the minds of paleontologists. So when more uncanny skulls turned up in Gibraltar in 1848, prudent scientists ignored them. Eight years later—and just months after the death of Buckland, the last great Deluge scientist—miners worked loose still more of the odd bones from a limestone quarry in Germany’s Neander Valley. One scholar, channeling Buckland, identified them as belonging to a deformed Cossack who had been injured by Napoleon’s army and crawled into a cliff-side cave to die. But this time two other scientists reasserted that the remains belonged to a distinct line of hominids, a race more outcast than the biblical Ishmaelites. Perhaps it helped that, among the various bones, the duo had located an adult skullcap down to the eye sockets, which emphasized the thick, glowering brow we still associate with Neanderthals.*

  With their eyes opened—and with the publication in 1859 of a little book by Charles Darwin—paleontologists began to find Neanderthals and related hominids across Africa, the Middle East, and Europe. The existence of ancient humans became a scientific fact. But just as predictably, the new evidence provoked new confusion. Skeletons can shift in the ground as rock formations buckle, fouling up attempts to date or interpret them. Bones also scatter and get crushed into smithereens, forcing scientists to rebuild entire creatures from a few stray molars or metatarsals—a subjective process open to dissension and differing interpretations. There’s no guarantee either that scientists will find representative samples: if scientists in AD 1,000,000 discovered what remains of Wilt Chamberlain, Tom Thumb, and Joseph Merrick, would they even classify them as the same species? For these reasons, every new discovery of Homo this and Homo that in the 1800s and 1900s incited further and often nasty debate. And decade after decade passed without the ultimate questions (Were all archaic humanoids our ancestors? If not, how many twigs of humanity existed?) becoming clearer. As the old joke went, put twenty paleontologists into a room, and you’d get twenty-one different schemes for human evolution. One world expert in archaic human genetics, Svante Pääbo, has noted, “I’m often rather surprised about how much scientists fight in paleontology…. I suppose the reason is that paleontology is a rather data-poor science. There are probably more paleontologists than there are important fossils in the world.”

  Such was the general state of things when genetics invaded paleontology and archaeology beginning in the 1960s—and invaded is the apt word. Despite their quarrels, U-turns, and antiquated tools, paleontologists and archaeologists had figured out a lot about human origins. They didn’t need a savior, thanks. So many of them resented the intrusion of biologists with their DNA clocks and molecular-based family trees, hotshots intent on overturning decades of research with a single paper. (One anthropologist scoffed at the strictly molecular approach as “no muss, no fuss, no dishpan hands. Just throw some proteins into a laboratory apparatus, shake them up, and bingo!—we have an answer to questions that have puzzled us for three generations.”) And really, the older scientists’ skepticism was warranted: paleogenetics turned out to be beastly hard, and despite their promising ideas, paleogeneticists had to spend years proving their worth.

  One problem with paleogenetics is that DNA is thermodynamically unstable. Over time, C chemically degrades into T, and G degrades into A, so paleogeneticists can’t always believe what they read in ancient samples. What’s more, even in the coldest climates, DNA breaks down into gibberish after 100,000 years; samples older than that harbor virtually no intact DNA. Even in relatively fresh samples, scientists might find themselves piecing together a billion-base-pair genome from fragments just fifty letters long—proportionally equivalent to reconstructing your typical hardcover from strokes, loops, serifs, and other fragments smaller than the tittle on an i.

  Oh, and most of those fragments are junk. No matter where a corpse falls—the coldest polar ice cap, the driest Saharan dune—bacteria and fungi will worm inside and smear their own DNA around. Some ancient bones contain more than 99 percent foreign DNA, all of which must be laboriously extracted. And that’s the easy kind of contamination to deal with. DNA spreads so easily from human contact (even touching or breathing on a sample can pollute it), and ancient hominid DNA so closely mirrors our own, that ruling out human contamination in samples is almost impossible.

  These obstacles (plus a few embarrassing retractions over the years) have pushed paleogeneticists into near paranoia about contamination, and they demand controls and safeguards that seem a better fit for a biological warfare lab. Paleogeneticists prefer samples no human has ever handled—ideally, ones still dirty from remote dig sites, where workers use surgical masks and gloves and drop everything into sterile bags. Hair is the best material, since it absorbs fewer contaminants and can be bleached clean, but paleogeneticists will settle for less fragile bone. (And given the paucity of uncontaminated sites, they often settle for bones in museum storage lockers, especially bones so boring no one ever bothered studying them before.)

  The sample selected, scientists bring it into a “clean room” maintained at higher-than-normal air pressure, so that air currents—and more to the point, the floating scraps of DNA that can ride around on air currents—cannot flow inward when the door opens. Anyone allowed inside the room dresses toe to top in sterile scrubs with face masks and booties and two pairs of gloves, and they get pretty used to the odor of the bleach swabbed over most surfaces. (One lab bragged that its technicians, presumably while in their suits, are given sponge baths in bleach.) If the sample is bone, scientists use dentist drills or picks to shave off a few grams of powder. They might even doctor the drill so it rotates only at 100 rpm, since the heat of a standard, 1,000-rpm drill can fry DNA. They then dissolve the nib of powder with chemicals, which liberates the DNA. At this point paleogeneticists often add tags—snippets of artificial DNA—to every fragment. That way they can tell if extraneous DNA, which lacks the tag,* ever infiltrates the sample after it leaves the clean room. Scientists might also note the racial backgrounds of technicians and other scientists (and probably even janitors) at the lab, so that if unexpected ethnic sequences show up, they can judge whether their sample was compromised.

  After all this preparation, the actual DNA sequencing begins. We’ll talk about this process in detail later, but basically scientists determine the A-C-G-T sequence of each individual DNA fragment, then use sophisticated software to piece the many, many fragments together. Paleogeneticists have successfully applied this technique to stuffed quaggas, cave bear skulls, woolly mammoth tufts, bees in amber, mummy skin, even Buckland’s beloved coprolites. But the most spectacular work along these lines comes from Neanderthal DNA. After the discovery of Neanderthals, many scientists classified them as archaic humans—the first (before the metaphor became tired) missing link. Others put Neanderthals on their own terminal branch of evolution, while some European scientists considered Neanderthals the ancestors of some human races but not others. (Again, sigh, you can guess which races they singled out: Africans and Aborigines.) Regardless of the exact taxonomy, scientists considered Neanderthals thick-witted and lowbrow, and it didn’t surprise anyone that they’d died out. Eventually some dissenters began arguing that Neanderthals showed more smarts than they got credit for: they used stone tools, mastered fire, buried their dead (sometimes with wildflowers), cared for the weak and lame, and possibly wore jewelry and played bone flutes. But the scientists couldn’t prove that Neanderthals hadn’t watched humans do these things first and aped them, which hardly takes supreme intelligence.

  DNA, though, permanently changed our view of Neanderthals. As early as 1997, the first scraps of Neanderthal mitochondrial DNA showed that Neanderthals weren’t direct human ancestors. At the same time, when the complete Neanderthal genome appeared in 2010, it turned out that the butt of so many Far Side cartoons was pretty darn human anyway; we share well north of 99 percent of our genome with them. In some cases this overlap was homely: Neanderthals likely had reddish hair and pale skin; they had the most common blood type worldwide, O; and like most humans, they couldn’t digest milk as adults. Other findings were more profound. Neanderthals had MHC immunity genes similar to ours, and also shared a gene, foxp2, associated with language skills, which means they may have been articulate.

  It’s not clear yet whether Neanderthals had alternative versions of apoE, but they got more of their protein from flesh than we did and so probably had some genetic adaptations to metabolize cholesterol and fight infections. Indeed, archaeological evidence suggests that Neanderthals didn’t hesitate to eat even their own dead—perhaps as part of primitive shamanic rituals, perhaps for darker reasons. At a cave in northern Spain, scientists have discovered the fifty-thousand-year-old remains of twelve murdered Neanderthal adults and children, many of them related. After the deed, their probably starving assailants butchered them with stone tools and cracked their bones to suck the marrow, cannibalizing every edible ounce. A gruesome scene, but it was from this surfeit of 1,700 bones that scientists extracted much of their early knowledge of Neanderthal DNA.

  Like it or not, similar evidence exists for human cannibalism. Each hundred-pound adult, after all, could provide starving comrades with forty pounds of precious, protein-rich muscle, plus edible fat, gristle, liver, and blood. More uncomfortably, archaeological evidence has long suggested that humans tucked into each other even when not famished. But for years questions persisted about whether most nonstarvation cannibalism was religiously motivated and selective or culinary and routine. DNA suggests routine. Every known ethnic group worldwide has one of two genetic signatures that help our bodies fight off certain diseases that cannibals catch, especially mad-cow-like diseases that come from eating each other’s brains. This defensive DNA almost certainly wouldn’t have become fixed worldwide if it hadn’t once been all too necessary.

  As the cannibalism DNA shows, scientists don’t rely entirely on ancient artifacts for information about our past. Modern human DNA holds clues as well. And about the first thing scientists noticed when they began surveying modern human DNA is its lack of variety. Roughly 150,000 chimps and around the same number of gorillas are living today, compared to some seven billion humans. Yet humans have less genetic diversity than these apes, significantly less. This suggests that the worldwide population of humans has dipped far below the population of chimps and gorillas at some point in the relatively recent past, perhaps multiple times. Had the Endangered Species Act existed way back when, Homo sapiens might have been the Paleolithic equivalent of pandas and condors.

  Scientists disagree on why our population decreased so much, but the origins of the debate trace back to two different theories—or really, two different weltanschauungs—first articulated in William Buckland’s day. Virtually every scientist before then upheld a catastrophist view of history—that floods, earthquakes, and other cataclysms had sculpted the planet quickly, throwing up mountains over a long weekend and wiping out species overnight. A younger generation—especially Buckland’s student Charles Lyell, a geologist—pushed gradualism, the idea that winds, tides, erosion, and other gentle forces shaped the earth and its inhabitants achingly slowly. For various reasons (including some posthumous smear campaigns), gradualism became associated with proper science, catastrophism with lazy reasoning and theatrical biblical miracles, and by the early 1900s catastrophism itself had been (and this puts it mildly) annihilated in science. Eventually the pendulum swung back, and catastrophism became respectable again after 1979, when geologists discovered that a city-sized asteroid or comet helped eradicate the dinosaurs. Since then, scientists have accepted that they can uphold a proper gradualist view for most of history and still allow that some pretty apocalyptic events have taken place. But this acceptance makes it all the more curious that one ancient calamity, the first traces of which were discovered within a year of the discovery of the dino impact, has received far less attention. Especially considering that some scientists argue that the Toba supervolcano almost eliminated a species far more dear to us than dinosaurs: Homo sapiens.

  Getting a grasp on Toba takes some imagination. Toba is—or was, before the top 650 cubic miles blew off—a mountain in Indonesia that erupted seventy-odd thousand years ago. But because no witnesses survived, we can best appreciate its terror by comparing it (however faintly) to the second-largest known eruption in that archipelago, the Tambora eruption of 1815.

  In early April 1815, three pillars of fire straight out of Exodus blasted out of Tambora’s top. Tens of thousands died as psychedelic orange lava surfed down the mountainside, and a tsunami five feet high and traveling at 150 miles per hour battered nearby islands. People fifteen hundred miles away (roughly from New York to mid–South Dakota) heard the initial blast, and the world went black for hundreds of miles around as a smoke plume climbed ten miles into the sky. That smoke carried with it enormous amounts of sulfurous chemicals. At first, these aerosols seemed harmless, even pleasant: in England, they intensified the pinks, oranges, and bloody reds of the sunsets that summer, a celestial drama that probably influenced the land- and sunscapes of painter J. M. W. Turner. Later effects were less cute. By 1816—popularly known as The Year Without a Summer—the sulfurous ejecta had mixed homogeneously into the upper atmosphere and began reflecting sunlight back into space. This loss of heat caused freak July and August snowstorms in the fledgling United States, and crops failed widely (including Thomas Jefferson’s corn at Monticello). In Europe Lord Byron wrote a dire poem in July 1816 called “Darkness,” which opens, “I had a dream, which was not all a dream. / The bright sun was extinguish’d… / Morn came and went—and came, and brought no day, / And men… / Were chill’d into a selfish prayer for light.” A few writers happened to holiday with Byron that summer near Lake Geneva, but they found the days so dreary that they mostly sulked indoors. Channeling their mood, some took to telling ghost stories for entertainment—one of which, by young Mary Shelley, became Frankenstein.

  Now, with all that in mind about Tambora, consider that Toba spewed for five times longer and ejected a dozen times more material—millions of tons of vaporized rock per second* at its peak. And being so much bigger, Toba’s enormous black basilisk of a plume could do proportionately more damage. Because of prevailing winds, most of the plume drifted westward. And some scientists think that a DNA bottleneck began when the smoke, after sweeping across south Asia, scythed into the very grasslands in Africa where humans lived. According to this theory, the destruction happened in two phases. In the short term, Toba dimmed the sun for six years, disrupted seasonal rains, choked off streams, and scattered whole cubic miles of hot ash (imagine wading through a giant ashtray) across acres and acres of plants, a major food source. It’s not hard to imagine the human population plummeting. Other primates might have suffered less at first because humans camped on the eastern edge of Africa, in Toba’s path, whereas most primates lived inland, sheltered somewhat behind mountains. But even if Toba spared other animals initially, no one escaped the second phase. Earth was already mired in an Ice Age in 70,000 BC, and the persistent reflection of sunlight into space might well have exacerbated it. We have evidence that the average temperature dropped twenty-plus degrees in some spots, after which the African savannas—our ancient homes—probably contracted like puddles in August heat. Overall, then, the Toba-bottleneck theory argues that the initial eruption led to widespread starvation, but the deepening Ice Age is what really pinned the human population down.

  Macaque, orangutan, tiger, gorilla, and chimpanzee DNA also shows some signs of bottlenecking right around Toba, but humans really suffered. One study suggested that the human population, worldwide, might have dropped to forty adults. (The world record for fitting people in a phone booth is twenty-five.) That’s an outlandishly pessimistic guess even among disaster scientists, but it’s common to find estimates of a few thousand adults, below what some minor-league baseball teams draw. And given that these humans might not have been united in one place either, but scattered in small, isolated pockets around Africa, our species’ prospects looked even shakier. If the Toba-bottleneck theory is true, then the lack of diversity in human DNA has a simple explanation. We damn near went extinct.

 
