
The Violinist's Thumb: And Other Lost Tales of Love, War, and Genius, as Written by Our Genetic Code


by Sam Kean


  After two decades of exploring these grand metaphysical notions about life in the abstract, Lamarck’s actual physical life began unraveling. His academic position had always been precarious, since his theory of acquired traits had never impressed some colleagues. (One strong, if glib, refutation was that Jewish boys still needed circumcising after three millennia of snip-snipping.) He’d also slowly been going blind, and not long after 1820, he had to retire as professor of “insects, worms, and microscopic animals.” Lacking both fame and income, he soon became a pauper, wholly reliant on his daughter’s care. When he died in 1829, he could only afford a “rented grave”—meaning his vermes-eaten remains got just five years’ rest before they were dumped in the Paris catacombs to make room for a new client.

  But a bigger posthumous insult awaited Lamarck, courtesy of the baron. Cuvier and Lamarck had actually collaborated when they’d first met in postrevolutionary Paris, if not as friends then as friendly colleagues. Temperamentally, though, Cuvier was 180 degrees opposed to Lamarck. Cuvier wanted facts, facts, facts, and distrusted anything that smacked of speculation—basically all of Lamarck’s late work. Cuvier also rejected evolution outright. His patron Napoleon had conquered Egypt and lugged back tons of scientific booty, including murals of animals and mummies of cats, crocodiles, monkeys, and other beasts. Cuvier dismissed evolution because these species clearly hadn’t changed in thousands of years, which seemed a good fraction of the earth’s lifetime back then.

  Rather than limit himself to scientific refutations, Cuvier also used his political power to discredit Lamarck. As one of the many hats he wore, Cuvier composed eulogies for the French scientific academy, and he engineered these éloges to ever-so-subtly undermine his deceased colleagues. Damning with faint praise, he opened Lamarck’s obit by lauding his late colleague’s dedication to vermin. Still, honesty compelled Cuvier to point out the many, many times his dear friend Jean-Baptiste had strayed into bootless speculation about evolution. Baron Cuvier also turned Lamarck’s undeniable gift for analogies against him, and sprinkled the essay with caricatures of elastic giraffes and damp pelican bums, which became indelibly linked to Lamarck’s name. “A system resting on such foundations may amuse the imagination of a poet,” Cuvier summed up, “but it cannot for a moment bear the examination of anyone who has dissected the hand, the viscera, or even a feather.” Overall, the “eulogy” deserves the title of “cruel masterpiece” that science historian Stephen Jay Gould bestowed. But all morality aside, you do have to hand it to the baron here. To most men, writing eulogies would have been little more than a pain in the neck. Cuvier saw that he could parlay this small burden into a great power, and had the savviness to pull it off.

  After Cuvier’s takedown, a few romantic scientists clung to Lamarckian visions of environmental plasticity, while others, like Mendel, found Lamarck’s theories wanting. Many, though, had trouble making up their minds. Darwin acknowledged in print that Lamarck had proposed a theory of evolution first, calling him a “justly celebrated naturalist.” And Darwin did believe that some acquired characteristics (including, rarely, circumcised penises) could be passed down to future generations. At the same time, Darwin dismissed Lamarck’s theory in letters to friends as “veritable rubbish” and “extremely poor: I got not fact or idea from it.”

  One of Darwin’s beefs was his belief that creatures gained advantages mostly through inherent traits, traits fixed at birth, not Lamarck’s acquired traits. Darwin also emphasized the excruciating pace of evolution, how long everything took, because inborn traits could spread only when the creatures with advantages reproduced. In contrast, Lamarck’s creatures took control of their evolution, and long limbs or big muscles spread everywhere lickety-split, in one generation. Perhaps worst, to Darwin and everyone else, Lamarck promoted exactly the sort of empty teleology—mystical notions of animals perfecting and fulfilling themselves through evolution—that biologists wanted to banish from their field forever.*

  Equally damning to Lamarck, the generation after Darwin discovered that the body draws a strict line of demarcation between normal cells and sperm and egg cells. So even if a blacksmith had the tris, pecs, and delts of Atlas himself, it doesn’t mean squat. Sperm are independent of muscle cells, and if the blacksmith’s sperm are 98-pound weaklings DNA-wise, so too might his children be. In the 1950s, scientists reinforced this idea of independence by proving that body cells can’t alter the DNA in sperm or egg cells, the only DNA that matters for inheritance. Lamarck seemed dead forever.

  In the past few decades, though, the vermes have turned. Scientists now see inheritance as more fluid, and the barriers between genes and the environment as more porous. It’s not all about genes anymore; it’s about expressing genes, or turning them on and off. Cells commonly turn DNA off by dotting it with small bumps called methyl groups, or turn DNA on by using acetyl groups to uncoil it from protein spools. And scientists now know that cells pass those precise patterns of methyls and acetyls on to daughter cells whenever they divide—a sort of “cellular memory.” (Indeed, scientists once thought that the methyls in neurons physically recorded memories in our brains. That’s not right, but interfering with methyls and acetyls can interfere with forming memories.) The key point is that these patterns, while mostly stable, are not permanent: certain environmental experiences can add or subtract methyls and acetyls, changing those patterns. In effect this etches a memory of what the organism was doing or experiencing into its cells—a crucial first step for any Lamarck-like inheritance.

  Unfortunately, bad experiences can be etched into cells as easily as good experiences. Intense emotional pain can sometimes flood the mammal brain with neurochemicals that tack methyl groups where they shouldn’t be. Mice that are (however odd this sounds) bullied by other mice when they’re pups often have these funny methyl patterns in their brains. As do baby mice (both foster and biological) raised by neglectful mothers, mothers who refuse to lick and cuddle and nurse. These neglected mice fall apart in stressful situations as adults, and their meltdowns can’t be the result of poor genes, since biological and foster children end up equally histrionic. Instead the aberrant methyl patterns were imprinted early on, and as neurons kept dividing and the brain kept growing, these patterns perpetuated themselves. The events of September 11, 2001, might have scarred the brains of unborn humans in similar ways. Some pregnant women in Manhattan developed post-traumatic stress disorder, which can epigenetically activate and deactivate at least a dozen genes, including brain genes. These women, especially the ones affected during the third trimester, ended up having children who felt more anxiety and acute distress than other children when confronted with strange stimuli.

  Notice that these DNA changes aren’t genetic, because the A-C-G-T string remains the same throughout. But epigenetic changes are de facto mutations; a gene that’s been switched off might as well not exist. And just like mutations, epigenetic changes live on in cells and their descendants. Indeed, each of us accumulates more and more unique epigenetic changes as we age. This explains why the personalities and even physiognomies of identical twins, despite identical DNA, grow more distinct each year. It also means that that detective-story trope of one twin committing a murder and both getting away with it—because DNA tests can’t tell them apart—might not hold up forever. Their epigenomes could condemn them.

  Of course, all this evidence proves only that body cells can record environmental cues and pass them on to other body cells, a limited form of inheritance. Normally when sperm and egg unite, embryos erase this epigenetic information—allowing you to become you, unencumbered by what your parents did. But other evidence suggests that some epigenetic changes, through mistakes or subterfuge, sometimes get smuggled along to new generations of pups, cubs, chicks, or children—close enough to bona fide Lamarckism to make Cuvier and Darwin grind their molars.

  The first time scientists caught this epigenetic smuggling in action was in Överkalix, a farming hamlet in the armpit between Sweden and Finland. It was a tough place to grow up during the 1800s. Seventy percent of households there had five or more children—a quarter, ten or more—and all those mouths generally had to be fed from two acres of poor soil, which was all most families could scrape together. It didn’t help that the weather above sixty-six degrees north latitude laid waste to their corn and other crops every fifth year or so. During some stretches, like the 1830s, the crops died almost every year. The local pastor recorded these facts in the annals of Överkalix with almost lunatic fortitude. “Nothing exceptional to remark,” he once observed, “but that the eighth [consecutive] year of crop failure occurred.”

  Not every year was wretched, naturally. Sporadically, the land blessed people with an abundance of food, and even families of fifteen could gorge themselves and forget the scarce times. But during those darkest winters, when the corn had withered and the dense Scandinavian forests and frozen Baltic Sea prevented emergency supplies from reaching Överkalix, people slit the throats of hogs and cows and just held on.

  This history—fairly typical on the frontier—would probably have gone unremarked except for a few modern Swedish scientists. They got interested in Överkalix because they wanted to sort out whether environmental factors, like a dearth of food, can predispose a pregnant woman’s child to long-term health problems. The scientists had reason to think so, based on a separate study of 1,800 children born during and just after a famine in German-occupied Holland—the Hongerwinter of 1944–45. Harsh winter weather froze the canals for cargo ships that season, and as the last of many favors to Holland, the Nazis destroyed bridges and roads that could have brought relief via land. The daily ration for Dutch adults fell to five hundred calories by early spring 1945. Some farmers and refugees (including Audrey Hepburn and her family, trapped in Holland during the war) took to gnawing tulip bulbs.

  After liberation in May 1945, the ration jumped to two thousand calories, and this jump set up a natural experiment: scientists could compare fetuses who gestated during the famine to fetuses who gestated afterward, and see who was healthier. Predictably, the starved fetuses were generally smaller and frailer babies at birth, but in later years they also had higher rates of schizophrenia, obesity, and diabetes. Because the babies came from the same basic gene pool, the differences probably arose from epigenetic programming: a lack of food altered the chemistry of the womb (the baby’s environment) and thereby altered the expression of certain genes. Even sixty years later, the epigenomes of those who’d starved prenatally looked markedly different, and victims of other modern famines—the siege of Leningrad, the Biafra crisis in Nigeria, the Great Leap Forward in Mao’s China—showed similar long-term effects.

  But because famines had happened so often in Överkalix, the Swedish scientists realized they had an opportunity to study something even more intriguing: whether epigenetic effects could persist through multiple generations. Kings of Sweden had long demanded crop records from every parish (to prevent anyone from cheating on taxes), so agricultural data existed for Överkalix from well before 1800. Scientists could then match the data with the meticulous birth, death, and health records the local Lutheran church kept. As a bonus, Överkalix had very little genetic influx or outflow. The risk of frostbite and a garish local accent kept most Swedes and Lapps from moving there, and of the 320 people the scientists traced, just nine abandoned Överkalix for greener pastures, so scientists could follow families for years and years.

  Some of what the Swedish team uncovered—like a link between maternal nutrition and a child’s future health—made sense. Much of it didn’t. Most notably, they discovered a robust link between a child’s future health and a father’s diet. A father obviously doesn’t carry babies to term, so any effect must have slipped in through his sperm. Even more strangely, the child got a health boost only if the father faced starvation. If the father gorged himself, his children lived shorter lives with more diseases.

  The influence of the fathers turned out to be so strong that scientists could trace it back to the father’s father, too—if grandpa Harald starved, baby grandson Olaf would benefit. These weren’t subtle effects, either. If Harald binged, Olaf’s risk of diabetes increased fourfold. If Harald tightened his belt, Olaf lived (after adjusting for social disparities) an average of thirty years longer. Remarkably, this was a far greater effect than starvation or gluttony had on Grandpa himself: grandpas who starved, grandpas who gorged, and grandpas who ate just right all lived to the same age, seventy years.

  This father/grandfather influence didn’t make any genetic sense; famine couldn’t have changed the parent’s or child’s DNA sequence, since that was set at conception. The environment wasn’t the culprit, either. The men who starved ended up marrying and reproducing in all different years, so their children and grandchildren grew up in different decades in Överkalix, some good, some bad—yet all benefited, as long as Dad or his dad had done without.

  But the influence might make epigenetic sense. Again, food is rich in acetyls and methyls that can flick genes on and off, so bingeing or starving can mask or unmask DNA that regulates metabolism. As for how these epigenetic switches got smuggled between generations, scientists found a clue in the timing of the starvation. Starving during puberty, during infancy, during peak fertility years—none of that mattered for the health of a man’s child or grandchild. All that mattered was whether he binged or starved during his “slow growth period,” a window from about nine to twelve years old, right before puberty. During this phase, males begin setting aside a stock of cells that will become sperm. So if the slow growth period coincided with a feast or famine, the pre-sperm might be imprinted with unusual methyl or acetyl patterns, patterns that would get imprinted on actual sperm in time.

  Scientists are still working out the molecular details of what must have happened at Överkalix. But a handful of other studies about soft paternal inheritance in humans supports the idea that sperm epigenetics has profound and inheritable effects. Men who take up smoking before eleven years old will have tubbier children, especially tubbier boys, than men who start smoking later, even if the grade-school smokers snuff the habit sooner. Similarly, the hundreds of millions of men in Asia and Africa who chew the pulp of betel nuts—a cappuccino-strength stimulant—have children with twice the risk of heart disease and metabolic ailments. And while neuroscientists cannot always find anatomical differences between healthy brains and brains addled with psychoses, they have detected different methyl patterns in the brains of schizophrenics and manic-depressives, as well as in their sperm. These results have forced scientists to revise their assumption that a zygote wipes clean all the environmental tarnish of sperm (and egg) cells. It seems that, Yahweh-like, the biological flaws of the fathers can be visited unto their children, and their children’s children.

  The primacy of sperm in determining a child’s long-term health is probably the most curious aspect of the whole soft inheritance business. Folk wisdom held that maternal impressions, like exposure to one-armed men, were devastating; modern science says paternal impressions count as much or more. Still, these parent-specific effects weren’t wholly unexpected, since scientists already knew that maternal and paternal DNA don’t quite contribute equally to children. If male lions mount female tigers, they produce a liger—a twelve-foot cat twice as heavy as your average king of the jungle. But if a male tiger knocks up a lion, the resulting tiglon isn’t nearly as hefty. (Other mammals show similar discrepancies. Which means that Ilya Ivanov’s attempts to impregnate female chimpanzees and female humans weren’t as symmetrical as he’d hoped.) Sometimes maternal and paternal DNA even engage in outright combat for control of the fetus. Take the igf gene (please).

  For once, spelling out a gene’s name helps make sense of it: igf stands for “insulin-like growth factor,” and it makes children in the womb hit their size milestones way earlier than normal. But while fathers want both of a child’s igf genes blazing away, to produce a big, hale baby that will grow up fast and pass its genes on early and often, mothers want to temper the igfs so that baby number one doesn’t crush her insides or kill her in labor before she has other children. So, like an elderly couple fighting over the thermostat, sperm tend to snap their igf into the on position, while eggs snap theirs off.

  Hundreds of other “imprinted” genes turn off or on inside us, too, based on which parent bestowed them. In Craig Venter’s genome, 40 percent of his genes displayed maternal/paternal differences. And deleting the exact same stretch of DNA can lead to different diseases, depending on whether Mom’s or Dad’s chromosome is deficient. Some imprinted genes even switch allegiance over time: in mice (and presumably in humans) maternal genes maintain control over brains as children, while paternal genes take over later in life. In fact, we probably can’t survive without proper “epigender” imprinting. Scientists can easily engineer mouse embryos with two sets of male chromosomes or two sets of female chromosomes, and according to traditional genetics, this shouldn’t be a big deal. But these double-gendered embryos expire in the womb. When scientists mixed in a few cells from the opposite sex to help the embryos survive, the males² became huge Botero babies (thanks to igf) but had puny brains. Females² had small bodies but oversized brains. Variations, then, between the brain sizes of Einstein and Cuvier might be nothing but a quirk of their parents’ bloodlines, like male pattern baldness.

  So-called parent-of-origin effects have also revived interest in one of the most egregious scientific frauds ever perpetrated. Given the subtlety of epigenetics—scientists have barely gotten a handle on it in the past twenty years—you can imagine that a scientist stumbling across these patterns long ago would have struggled to interpret his results, much less convince his colleagues of them. And Austrian biologist Paul Kammerer did struggle, in science and love and politics and everything else. But a few epigeneticists today see his story as maybe, just maybe, a poignant reminder about the peril of making a discovery ahead of its time.

 
