
The Violinist's Thumb: And Other Lost Tales of Love, War, and Genius, as Written by Our Genetic Code


by Sam Kean


  Wilson later admitted he’d been politically idiotic not to anticipate the firestorm, maelstrom, hurricane, and plague of locusts that such suggestions would cause among academics. Sure enough, some Harvard colleagues, including the publicly cuddly Stephen Jay Gould, lambasted Sociobiology as an attempt to rationalize racism, sexism, poverty, war, a lack of apple pie, and everything else decent people abhor. They also explicitly linked Wilson with vile eugenics campaigns and Nazi pogroms—then acted surprised when other folks lashed out. In 1978, Wilson was defending his work at a scientific conference when a few half-wit activists stormed onstage. Wilson, in a wheelchair with a broken ankle, couldn’t dodge or fight back, and they wrested away his microphone. After charging him with “genocide,” they poured ice water over his head, and howled, “You’re all wet.”

  By the 1990s, thanks to its dissemination by other scientists (often in softer forms), the idea that human behavior has firm genetic roots hardly seemed shocking. Similarly, we take for granted today another sociobiological tenet, that our hunter-scavenger-gatherer legacy left us with DNA that still biases our thinking. But just as the sociobiology ember was flickering, scientists in Scotland squirted kerosene on the public’s fear of genetics by announcing, in February 1997, the birth of probably the most famous nonhuman animal ever. After transferring adult sheep DNA into four hundred sheep eggs, then zapping them Frankenstein-style with electricity, the scientists managed to produce twenty viable embryos—clones of the adult donor. These clones spent six days in test tubes, then 145 in utero, during which time nineteen spontaneously aborted. Dolly lived.

  In truth, most of the humans gawking at this little lamb cared nothing about Dolly qua Dolly. The Human Genome Project was rumbling along in the background, promising scientists a blueprint of humanity, and Dolly stoked fears that scientists were ramping up to clone one of our own—and with no moratorium in sight. This frankly scared the bejeezus out of most people, although the bioethicist Arthur Caplan did field one excited phone call about the possibility of cloning Jesus himself. (The callers planned to lift DNA from the Shroud of Turin, natch. Caplan remembered thinking, “You are trying to bring back one of the few people that are supposed to come back anyway.”)

  Dolly, the first cloned mammal, undergoes a checkup. (Photo courtesy of the Roslin Institute, University of Edinburgh)

  Dolly’s pen mates accepted her, and didn’t seem to care about her ontological status as a clone. Nor did her lovers—she eventually gave birth to six (naturally begotten) lambs, all strapping. But for whatever reason, human beings fear clones almost instinctively. Post-Dolly, some people hatched sensational suppositions about clone armies goose-stepping through foreign capitals, or ranches where people would raise clones to harvest organs. Less outlandishly, some feared that clones would be burdened by disease or deep molecular flaws. Cloning adult DNA requires turning on dormant genes and pushing cells to divide, divide, divide. That sounds a lot like cancer, and clones do seem prone to tumors. Many scientists also concluded (although Dolly’s midwives dispute this) that Dolly was born a genetic geriatric, with unnaturally old and decrepit cells. Arthritis did in fact stiffen Dolly’s legs at a precocious age, and she died at age six (half her breed’s life span) after contracting a virus that, à la Peyton Rous, gave her lung cancer. The adult DNA used to clone Dolly had been—like all adult DNA—pockmarked with epigenetic changes and warped by mutations and poorly patched breaks. Such flaws might have corrupted her genome before she was ever born.*

  But if we’re toying with playing god here, we might as well play devil’s advocate, too. Suppose that scientists overcome all the medical limitations and produce perfectly healthy clones. Many people would still oppose human cloning on principle. Part of their reasoning, however, relies on understandable but thankfully faulty assumptions about genetic determinism, the idea that DNA rigidly dictates our biology and personality. With every new genome that scientists sequence, it becomes clearer that genes deal in probabilities, not certainties. A genetic influence is just that, only that. Just as important, epigenetic research shows that the environment changes how genes work and interact, so cloning someone faithfully might require preserving every epigenetic tag from every missed meal and every cigarette. (Good luck.) Most people forget too that it’s already too late to avoid exposure to human clones; they live among us even now, monstrosities called identical twins. A clone and its parent would be no more alike than twins are with all their epigenetic differences, and there’s reason to believe they’d actually be less alike.

  Consider: Greek philosophers debated the idea of a ship whose hull and decks were gradually rotting, plank by plank; eventually, over the decades, every original scrap of wood got replaced. Was it still the same ship at the end? Why or why not? Human beings present a similar stumper. Atoms in our body get recycled many, many times before death, so we don’t have the same bodies our whole lives. Nevertheless we feel like the same person. Why? Because unlike a ship, each human has an uninterrupted store of thoughts and remembrances. If the human soul exists, that mental memory cache is it. But a clone would have different memories than his parent—would grow up with different music and heroes, be exposed to different foods and chemicals, have a brain wired differently by new technologies. The sum of these differences would be dissimilar tastes and inclinations—leading to a dissimilar temperament and a distinct soul. Cloning would therefore not produce a doppelgänger in anything but literal superficialities. Our DNA does circumscribe us; but where we fall within our range of possibilities—our statures, what diseases we’ll catch, how our brains handle stress or temptation or setbacks—depends on more than DNA.

  Make no mistake, I’m not arguing in favor of cloning here. If anything, this argues against—since what would be the point? Bereaved parents might yearn to clone Junior and ease that ache every time they walked by his empty room, or psychologists might want to clone Ted Kaczynski or Jim Jones and learn how to defuse sociopaths. But if cloning won’t fulfill those demands—and it almost certainly cannot—why bother?

  Cloning not only riles people up over unlikely horrors, it also distracts from other controversies about human nature that genetic research can dredge up, and already has. As much as we’d like to close our eyes to these quarrels, they don’t seem likely to vanish.

  Sexual orientation has some genetic basis. Bees, birds, beetles, crabs, fish, skinks, snakes, toads, and mammals of all stripes (bison, lions, raccoons, dolphins, bears, monkeys) happily get frisky with their own sex, and their coupling often seems hardwired. Scientists have discovered that disabling even a single gene in mice—the suggestively named fucM gene—can turn female mice into lesbians. Human sexuality is more nuanced, but gay men (who have been studied more extensively than gay women) have substantially more gay relatives than heterosexual men raised in similar circumstances, and genes seem like one strong differentiator.

  This presents a Darwinian conundrum. Being gay decreases the likelihood of having children and passing on any “gay genes,” yet homosexuality has persisted in every last corner of the globe throughout all of history, despite often-violent persecution. One theory argues that perhaps gay genes are really “man-loving” genes—androphilic DNA that makes men love men but also makes women who have it lust after men, too, increasing their odds of having children. (Vice versa for gynophilic DNA.) Or perhaps homosexuality arises as a side effect of other genetic interactions. Multiple studies have found higher rates of left-handedness and ambidextrousness among gay men, and gay men frequently have longer ring fingers, too. No one really believes that holding a salad fork in one hand or the other causes homosexuality, but some far-reaching gene might influence both traits, perhaps by fiddling with the brain.

  These discoveries are double-edged. Finding genetic links would validate being gay as innate and intrinsic, not a deviant “choice.” That said, people already tremble about the possibility of screening for and singling out homosexuals, even potential homosexuals, from a young age. What’s more, these results can be misrepresented. One strong predictor of homosexuality is the number of older biological brothers someone has; each one increases the odds by 20 to 30 percent. The leading explanation is that a mother’s immune system mounts a progressively stronger response to each “foreign” Y chromosome in her uterus, and this immune response somehow induces homosexuality in the fetal brain. Again, this would ground homosexuality in biology—but you can see how a naive, or malicious, observer could twist this immunity link rhetorically and equate homosexuality with a disease to eradicate. It’s a fraught picture.
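
  To make that compounding concrete, here is a minimal sketch in Python; the 3 percent baseline is a hypothetical placeholder chosen only to show the arithmetic, not a figure from the studies above.

```python
# Toy illustration of how a 20 to 30 percent increase per older brother
# compounds multiplicatively. The baseline below is a hypothetical
# placeholder, not a measured figure.
baseline = 0.03  # assumed background rate, for illustration only

for brothers in range(4):
    low = baseline * (1.20 ** brothers)   # 20 percent increase per older brother
    high = baseline * (1.30 ** brothers)  # 30 percent increase per older brother
    print(f"{brothers} older brothers: roughly {low:.1%} to {high:.1%}")
```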

  Race also causes a lot of discomfort among geneticists. For one thing, the existence of races makes little sense. Humans have lower genetic diversity than almost any animal, but our colors and proportions and facial features vary as wildly as the finalists each year at Westminster. One theory of race argues that near extinctions isolated pockets of early humans with slight variations, and as these groups migrated beyond Africa and bred with Neanderthals and Denisovans and who knows what else, those variations became exaggerated. Regardless, some DNA must differ between ethnic groups: an aboriginal Australian husband and wife will never themselves produce a freckled, red-haired Seamus, even if they move to the Emerald Isle and breed till doomsday. Color is encoded in DNA.

  The sticking point, obviously, isn’t Maybelline-like variations in skin tone but other potential differences. Bruce Lahn, a geneticist at the University of Chicago, started his career cataloging palindromes and inversions on Y chromosomes, but around 2005 he began studying the brain genes microcephalin and aspm, which influence the growth of neurons. Although multiple versions exist in humans, one version of each gene had numerous hitchhikers and seemed to have swept through our ancestors at about Mach 10. This implied a strong survival advantage, and based on their ability to grow neurons, Lahn took a small leap and argued that these genes gave humans a cognitive boost. Intriguingly, he noted that the brain-boosting versions of microcephalin and aspm started to spread around 35,000 BC and 4,000 BC, respectively, right around the times when the first symbolic art and the first cities appeared in history. Hot on the trail, Lahn screened different populations alive today and determined that the brain-boosting versions appeared several times more often among Asians and Caucasians than among native Africans. Gulp.

  Other scientists denounced the findings as speculative, irresponsible, racist, and wrong. These two genes express themselves in many places beyond the brain, so they may have aided ancient Europeans and Asians in other ways. The genes seem to help sperm whip their tails faster, for one thing, and might have outfitted the immune system with new weapons. (They’ve also been linked to perfect pitch, as well as tonal languages.) Even more damning, follow-up studies determined that people with these genes scored no better on IQ tests than those without them. This pretty much killed the brain-boosting hypothesis, and Lahn—who, for what it’s worth, is a Chinese immigrant—soon admitted, “On the scientific level, I am a little bit disappointed. But in the context of the social and political controversy, I am a little bit relieved.”

  He wasn’t the only one: race really bifurcates geneticists. Some swear up and down that race doesn’t exist. It’s “biologically meaningless,” they maintain, a social construct. Race is indeed a loaded term, and most geneticists prefer to speak somewhat euphemistically of “ethnic groups” or “populations,” which they confess do exist. But even then some geneticists want to censor investigations into ethnic groups and mental aptitude as inherently wounding—they want a moratorium. Others remain confident that any good study will just prove racial equality, so what the hey, let them continue. (Of course the act of lecturing us about race, even to point out its nonexistence, probably just reinforces the idea. Quick—don’t think of green giraffes.)

  Meanwhile some otherwise very pious scientists think the “biologically meaningless” bit is baloney. For one thing, some ethnic groups respond poorly—for purely biochemical reasons—to certain medications for hepatitis C and heart disease, among other ailments. Other groups, because of meager conditions in their ancient homelands, have become vulnerable to metabolic disorders in modern times of plenty. One controversial theory argues that descendants of people captured in slave raids in Africa have elevated rates of hypertension today in part because ancestors whose bodies hoarded nutrients, especially salt, more easily survived the awful oceanic voyages to their new homes. A few ethnic groups even have higher immunity to HIV, though each group, again, for different biochemical reasons. In these and other cases—Crohn’s disease, diabetes, breast cancer—doctors and epidemiologists who deny race completely could harm people.

  On a broader level, some scientists argue that races exist because each geographic population has, indisputably, distinct versions of some genes. If you examine even a few hundred snippets of someone’s DNA, you can segregate him into one of a few broad ancestral groups nearly 100 percent of the time. Like it or not, those groups do generally correspond to people’s traditional notion of races—African, Asian, Caucasian (or “swine-pink,” as one anthropologist put it), and so on. True, there’s always genetic bleed-over between ethnic groups, especially at geographic crossroads like India, a fact that renders the concept of race useless—too imprecise—for many scientific studies. But people’s self-identified social race does predict their biological population group pretty well. And because we don’t know what every distinct version of every stretch of DNA does, a few polemical and very stubborn scientists who study races/populations/whatever-you-want-to-call-thems argue that exploring potential differences in intellect is fair game—they resent being censored. Predictably, both those who affirm and those who deny race accuse the other side of letting politics color their science.*
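
  For readers curious how a few hundred snippets of DNA can sort someone into a broad ancestral group, here is a minimal sketch of the basic idea: compare a person’s DNA variants against each group’s typical frequencies and pick the best match. The markers and frequencies below are invented purely for illustration; real studies use hundreds of markers and far more careful statistics.

```python
import math

# Toy sketch of ancestry assignment (illustration only; these marker
# frequencies are invented, not real population data).
# Each group is described by the frequency of the "A" variant at each marker.
group_freqs = {
    "Group1": [0.9, 0.2, 0.7, 0.1],
    "Group2": [0.3, 0.8, 0.4, 0.6],
    "Group3": [0.5, 0.5, 0.1, 0.9],
}

def assign_group(genotype):
    """genotype: list of 1 (has the 'A' variant) or 0 (doesn't), one per marker.
    Returns the group whose frequencies make this genotype most likely."""
    best_group, best_loglik = None, -math.inf
    for group, freqs in group_freqs.items():
        loglik = 0.0
        for has_a, freq in zip(genotype, freqs):
            p = freq if has_a else 1 - freq
            loglik += math.log(p)  # naive independence assumption across markers
        if loglik > best_loglik:
            best_group, best_loglik = group, loglik
    return best_group

print(assign_group([1, 0, 1, 0]))  # most consistent with Group1's frequencies
```

  With only four toy markers the assignment is shaky; with a few hundred real ones, the likelihoods separate so sharply that the call is nearly always unambiguous, which is the point the paragraph above is making.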

  Beyond race and sexuality, genetics has popped up recently in discussions of crime, gender relations, addiction, obesity, and many other things. Over the next few decades, in fact, genetic factors and susceptibilities will probably emerge for almost every human trait or behavior—take the over on that one. But regardless of what geneticists discover about these traits or behaviors, we should keep a few guidelines in mind when applying genetics to social issues. Most important, no matter the biological underpinnings of a trait, ask yourself if it really makes sense to condemn or dismiss someone based on how a few microscopic genes behave. Also, remember that most of our genetic predilections for behavior were shaped by the African savanna many thousands if not millions of years ago. So while “natural” in some sense, these predilections don’t necessarily serve us well today, since we live in a radically different environment. What happens in nature is a poor guide for making decisions anyway. One of the biggest boners in ethical philosophy is the naturalistic fallacy, which equates nature with “what’s right” and uses “what’s natural” to justify or excuse prejudice. We human beings are humane in part because we can look beyond our biology.

  In any study that touches on social issues, we can at least pause and not draw sensational conclusions without reasonably complete evidence. In the past five years, scientists have conscientiously sought out and sequenced DNA from more and more ethnic groups worldwide, to expand what remains, even today, an overwhelmingly European pool of genomes available to study. And some early results, especially from the self-explanatory 1000 Genomes Project, indicate that scientists might have overestimated the importance of genetic sweeps—the same sweeps that ignited Lahn’s race-intelligence firecracker.

  By 2010 geneticists had identified two thousand versions of human genes that showed signs of being swept along; specifically, because of low diversity around these genes, it looked as if hitchhiking had taken place. And when scientists looked for what differentiated these swept-along versions from versions not swept along, they found cases where a DNA triplet had mutated and now called for a new amino acid. This made sense: a new amino acid could change the protein, and if that change made someone fitter, natural selection might indeed sweep it through a population. However, when scientists examined other regions, they found the same signs of sweeps in genes with silent mutations—mutations that, because of redundancy in the genetic code, didn’t change the amino acid. Natural selection cannot have swept these changes along, because the mutation would be invisible and offer no benefits. In other words, many apparent DNA sweeps could be spurious, artifacts of other evolutionary processes.
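
  As an illustration of that redundancy (a sketch of mine, not from the book), here is how a single DNA-letter change can either swap an amino acid or leave the protein untouched:

```python
# Minimal illustration of redundancy in the genetic code.
# Only a handful of codons are listed; the full table has 64 entries.
codon_table = {
    "CTT": "Leucine", "CTC": "Leucine",   # synonymous codons
    "ATT": "Isoleucine",
    "GAA": "Glutamate", "GAG": "Glutamate",
    "GAT": "Aspartate",
}

def classify_mutation(before, after):
    """Label a codon change as silent (same amino acid) or amino-acid-changing."""
    return "silent" if codon_table[before] == codon_table[after] else "amino-acid-changing"

print(classify_mutation("CTT", "CTC"))  # silent: both codons still mean leucine
print(classify_mutation("CTT", "ATT"))  # changes leucine to isoleucine
```

  A sweep signature sitting around a change like the first one is suspicious: selection has nothing to “see” there, so the low diversity must have some other cause.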

  That doesn’t mean that sweeps never happen; scientists still believe that genes for lactose tolerance, hair structure, and a few other traits (including, ironically, skin color) did sweep through various ethnic groups at various points as migrants encountered new environments beyond Africa. But those might represent rare cases. Most human changes spread slowly, and probably no one ethnic group ever “leaped ahead” in a genetic sweepstakes by acquiring blockbuster genes. Any claims to the contrary—especially considering how often supposedly scientific claims about ethnic groups have fallen apart before—should be handled with caution. Because as the old saw says, it’s not what we don’t know that stirs up trouble, it’s what we do know that just ain’t so.

  Becoming wiser in the ways of genetics will require not only advances in understanding how genes work, but advances in computing power. Moore’s Law for computers—which says that microchips get roughly twice as powerful every two years—has held for decades, which explains why some pet collars today could outperform the Apollo mission mainframes. But since 1990 genetic technology has outstripped even Moore’s projections. A modern DNA sequencer can generate more data in twenty-four hours than the Human Genome Project did in ten long years, and the technology has become increasingly convenient, spreading to labs and field stations worldwide. (After killing Osama bin Laden in 2011, U.S. military personnel identified him—by matching his DNA to samples collected from relatives—within hours, in the middle of the ocean, in the dead of the a.m.) Simultaneously, the cost of sequencing an entire genome has gone into vacuum free-fall—from $3,000,000,000 to $10,000, from $1 per base pair to around 0.0003¢. If scientists want to study a single gene nowadays, it’s often cheaper to sequence the entire genome instead of bothering to isolate the gene first and sequence just that part.
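
  To see how far sequencing has outrun Moore’s law, here is a back-of-the-envelope sketch using the round numbers in this paragraph; the roughly 3-billion-base-pair genome and the twenty-year window are my own assumptions for the arithmetic, not figures from the book.

```python
# Back-of-the-envelope comparison (assumptions: ~3 billion base pairs per
# genome, a roughly 20-year window since the Human Genome Project began).
hgp_cost = 3_000_000_000   # dollars, Human Genome Project
modern_cost = 10_000       # dollars, whole genome at the time of writing
base_pairs = 3_000_000_000

print(f"Cost per base pair then: ${hgp_cost / base_pairs:.2f}")
print(f"Cost per base pair now:  {100 * modern_cost / base_pairs:.4f} cents")

# Moore's law: doubling every 2 years, so 2**10, about a 1,000-fold gain in 20 years.
moore_gain = 2 ** (20 / 2)
sequencing_gain = hgp_cost / modern_cost
print(f"Moore's-law improvement over 20 years: ~{moore_gain:,.0f}x")
print(f"Actual sequencing cost drop:           ~{sequencing_gain:,.0f}x")
```

  The sketch reproduces the paragraph’s per-base figures and shows the gap: a thousandfold gain would have been impressive; sequencing managed something closer to three hundred thousandfold.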

 
