As the technology has gotten more accurate and powerful, costs have gone down dramatically. The chart below gives some indication of how quickly the cost of genome sequencing has decreased over the past decade and a half.
[Chart: Cost per Genome. The cost of sequencing a human genome compared with the reductions that would be expected at the rate Moore's law predicts for computer chips. Over the past decade, next-generation sequencing and cloud computing drove the price of sequencing down; the average bumped higher in recent years because of brief slowdowns in production.]
Source: “The Cost of Sequencing a Human Genome,” NIH, last modified July 6, 2016, https://www.genome.gov/27565109/the-cost-of-sequencing-a-human-genome/.
Today, sequencing a full genome takes about a day and costs about $700. Illumina CEO Francis deSouza announced in early 2017 that the company expected to be able to sequence a full genome for about $100 in the not-too-distant future. As the cost of sequencing falls toward the cost of the materials required and genome sequencing becomes commoditized, more data will be available at less expense. Because genomics is the ultimate big-data challenge, more and cheaper data will lay a foundation for more and greater discoveries.
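To make that pace concrete, here is a minimal Python sketch comparing a Moore's-law cost curve with the roughly $700 figure cited above. The $100 million starting cost in 2001 and the two-year halving period are illustrative assumptions in the spirit of the NIH chart, not figures taken from this text.

```python
# Illustrative comparison: Moore's-law decline vs. actual sequencing cost.
# Assumptions (not from the text): ~$100M per genome in 2001, and Moore's
# law modeled as a halving of cost every two years.

START_YEAR = 2001
START_COST = 100_000_000  # assumed baseline cost per genome, in dollars
HALVING_YEARS = 2         # assumed Moore's-law-style halving period

def moores_law_cost(year: int) -> float:
    """Projected cost per genome if prices had only fallen at Moore's-law pace."""
    return START_COST * 0.5 ** ((year - START_YEAR) / HALVING_YEARS)

projected = moores_law_cost(2019)
actual = 700  # approximate present-day cost cited in the text
print(f"Moore's-law projection for 2019: ${projected:,.0f}")
print(f"Cost cited in the text: ${actual:,}")
print(f"Sequencing beat the projection by a factor of ~{projected / actual:,.0f}")
```

The exact numbers matter less than the shape: sequencing costs collapsed far faster than chip prices, which is the point the chart makes.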
But even if sequencing were entirely ubiquitous, commoditized, and free, it wouldn’t mean a thing unless scientists were able to understand what the genomes were saying.
If a Martian came to Earth wanting to learn about how humans organize information, she would first need to figure out that we had things called books. Then she would need to figure out that these books had pages made up of words formed by letters. That's the equivalent of what we get when identifying that DNA is organized into genes packaged in chromosomes that code for proteins instructing cells what to do. If the Martian then wanted to understand what the books actually said, she'd need to figure out what the words mean and how to read them. Similarly, once scientists figured out the basics of how genes were organized, they still needed to figure out what the genes were actually doing.
The good news is that they had an increasing number of tricks up their sleeves. As researchers sequenced more individual worms, flies, mice, and other relatively simple "model organisms" used to help understand more general biological processes, they tried to correlate differences between similar types of organisms with differences in their genes. Once they formed a hypothesis, they bred organisms with the same genetic mutation to see if the resulting offspring expressed the same trait. Eventually, scientists were able to turn different genes on and off in living animals to observe how specific traits changed as a result. They used advanced computational tools to analyze the interactions of many genes and conducted ever-broader association studies to analyze ever-larger genetic data sets.
Understanding genetic data sets would be complicated enough if all biology were based on gene expression alone, but it is significantly more complicated. The genome is itself an incredibly complex ecosystem that interacts both with other complex systems inside an organism and with the changing environment around it. A small percentage of traits and diseases result from the expression of single genes, but most come from groups of genes working together and interacting with the broader environment.
No one really knows the exact number, but it has been estimated that hundreds or thousands of genes play a role in determining complex traits like intelligence, height, and personality style. These genes don't act alone. Ribonucleic acid, or RNA, once believed to be merely a messenger between DNA and the protein-making machinery of the cell, is now understood to play an important role in gene expression. Epigenetic marks, chemical tags layered on top of the genome, also help determine how genes are expressed. Understanding how these overlapping processes influence complex genetic traits was far too difficult in the first phase of genomics research, but figuring out the relatively small percentage of traits and diseases caused by single-gene mutations was more feasible.
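To make the many-genes picture concrete, here is a minimal toy model in Python. Every number in it, including the variant count, the effect sizes, and the environmental term, is invented for illustration; real polygenic models are estimated from the large association studies mentioned above.

```python
# Toy additive model of a polygenic trait: many variants, each nudging the
# trait slightly, summed together with an environmental term. All values
# below are invented for illustration, not real effect sizes.

import random

random.seed(42)

N_VARIANTS = 1000  # hypothetical number of contributing variants
effects = [random.gauss(0, 0.05) for _ in range(N_VARIANTS)]  # tiny effects

def trait_score(genotype, environment):
    """Sum per-variant contributions (0, 1, or 2 copies) plus environment."""
    genetic = sum(copies * effect for copies, effect in zip(genotype, effects))
    return genetic + environment

genotype = [random.choice([0, 1, 2]) for _ in range(N_VARIANTS)]
print(trait_score(genotype, environment=random.gauss(0, 1.0)))
```

The point is structural: no single variant determines the outcome; the trait emerges from many tiny contributions plus the environment.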
Cystic fibrosis, Huntington’s disease, muscular dystrophy, sickle cell disease, and Tay-Sachs are all examples of single-gene mutation diseases, also known as Mendelian diseases because they clearly follow Mendel’s rules of heredity. Some of these disorders are called dominant because a child will need to inherit just one copy of a mutation from a single parent to have the disease. For recessive disorders, like Tay-Sachs, a child would need to inherit the mutation from both parents to be at risk. (In some rare instances people with these mutations don’t get the particular disease, most likely because other genes counteract the mutation.) Of the approximately twenty-five thousand Mendelian diseases that have so far been identified, about ten thousand are understood well enough to match a specific gene to a specific disease outcome.8 Today treatments exist only for around 5 percent of these.
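As a rough illustration of Mendel's rules, the sketch below computes a child's risk from the parents' genotypes for the dominant and recessive cases just described. It is a deliberately simplified model, one gene and two alleles, ignoring the counteracting modifier genes the text mentions.

```python
# Minimal model of single-gene (Mendelian) inheritance. Each parent passes
# one of their two alleles to the child at random. 'D' marks the disease
# allele, 'h' the healthy one. Simplified: ignores the modifier genes and
# rare non-penetrance mentioned in the text.

from itertools import product

def child_disease_risk(parent1: str, parent2: str, dominant: bool) -> float:
    """Probability the child is affected, given parental genotypes like 'Dh'."""
    outcomes = list(product(parent1, parent2))  # equally likely allele pairs
    def affected(child):
        copies = child.count("D")
        return copies >= 1 if dominant else copies == 2
    return sum(affected(c) for c in outcomes) / len(outcomes)

# Dominant disorder: one copy from a single parent is enough -> 50% risk.
print(child_disease_risk("Dh", "hh", dominant=True))   # 0.5

# Recessive disorder such as Tay-Sachs: two carrier parents -> 25% risk.
print(child_disease_risk("Dh", "Dh", dominant=False))  # 0.25
```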
These Mendelian diseases are very rare. Only one out of every thirty thousand people, for example, is born in the United States with cystic fibrosis; one of ten thousand inherits Huntington's disease, and one of 7,250 males inherits Duchenne muscular dystrophy. One of every 365 African American children is born with sickle cell disease, a disease more prominent among groups whose recent ancestors lived in highly malarial areas. Other Mendelian diseases can strike one in millions or even tens of millions or more.9* Many of these diseases cause terrible suffering and even premature death. But because they are so rare, society as a whole generally has less incentive to invest in finding cures for these diseases than in finding cures for more common afflictions, like cancer or heart and lung disease, that impact segments of the population with greater numbers, voice, and political power. Although some new research suggests that variants in Mendelian genes might play a greater role in more common diseases like metastatic prostate cancer, these preliminary findings have so far not shifted the incentive structure.10
With so many rare genetic diseases unlikely to receive the attention and resources needed to generate cures, parents and at-risk communities, inspired by the new insights coming from genetic technologies, have started to look for their own ways to protect their future children.
Children born with Tay-Sachs, a genetic disease resulting from a single genetic mutation on chromosome 15, often seem fine at birth, but the destruction of their nervous systems begins soon after. By around the age of two, most are experiencing terrible seizures and the decline of mental capacity. Many become blind and nonresponsive. Most die in agony before the age of five. About one in every twenty-seven Ashkenazi Jews is a carrier of the Tay-Sachs mutation, and hundreds of Jews around the world used to die of the disease each year. Today almost none do, a miracle of both science and social organization.
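A back-of-the-envelope calculation shows why carrier screening was so powerful. It uses the one-in-twenty-seven carrier rate cited above and the one-in-four recessive risk from the earlier sketch; the random-mating assumption is mine, not the text's.

```python
# Rough Tay-Sachs arithmetic from the carrier rate in the text, assuming
# random mating within the community (a simplification).

carrier_rate = 1 / 27                  # share of Ashkenazi Jews who are carriers
both_carriers = carrier_rate ** 2      # chance a random couple are both carriers
risk_per_child = both_carriers * 0.25  # recessive: 1-in-4 risk per child

print(f"Couples where both partners are carriers: about 1 in {1 / both_carriers:.0f}")
print(f"Expected affected births without screening: about 1 in {1 / risk_per_child:.0f}")
```

Screening works by finding the roughly one-in-729 at-risk couples before they conceive, which is exactly what the community programs described below set out to do.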
After scientists in 1969 identified the enzyme associated with Tay-Sachs carriers, a blood test was developed to determine carrier status among prospective parents—and Jewish communities worldwide swung into gear. Jewish community centers and synagogues in the United States, Canada, Europe, Israel, and elsewhere held screenings. Couples in which both prospective parents were carriers were advised to adopt or to get tested after pregnancy. Those mothers carrying embryos with the disease almost always chose to terminate their pregnancies—a painful choice but perhaps less painful than watching their future children die of the disease. The Orthodox Jewish community empowered matchmakers to have marriage candidates genetically tested and steer carriers away from marrying each other.
With the advent of gene sequencing, the genetic mutation responsible for Tay-Sachs was identified in 1985 and multiple mutations of the responsible gene have been identified since. Tay-Sachs is now an exceedingly rare disease among Jewish populations.
In light of the proven benefits of genetic screening for Tay-Sachs, some researchers and policymakers are now calling for "expanded carrier testing" to assess whether other categories of prospective parents have the potential to pass Mendelian diseases and disease risks to their children.11
Genome sequencing and biochemical enzyme-level measurement were monumental breakthroughs that began to help prevent the transmission of relatively simple genetic diseases, but genetic analysis alone couldn't transform the way humans make babies unless paired with new options for applying that knowledge.† The paired revolutions of in vitro fertilization, or IVF, and embryo screening created the mechanism through which genetic analysis could fundamentally transform human baby-making. These revolutions had been a long time coming.
Studying rabbit eggs in 1878, more than a century after Spallanzani's experiments with frog condoms, Viennese embryologist Samuel Leopold Schenk—who, coincidentally, studied at the University of Vienna at the same time as Gregor Mendel—discovered that when he added sperm to the eggs he had isolated in a glass dish, the eggs started dividing. These were the very early years of understanding the reproductive process, but Schenk correctly surmised the eggs were being fertilized. That mammal eggs could be fertilized in a dish suggested that these fertilized eggs could potentially be implanted into the mother and taken to term. In theory, yes; but in practice, not yet. It would take another eighty years until American scientist M. C. Chang successfully impregnated a rabbit with an egg fertilized in a glass dish—or, using the Latin phrase, in vitro. But making a bunny was still a far cry from making a human baby. That, too, was coming.
At a historic meeting at the Royal Society of Medicine in London in 1968, biomedical researcher Robert Edwards, one of the world's leading experts on human egg development, approached obstetrician Patrick Steptoe, the leading developer of laparoscopy, the surgical process for inspecting a woman's pelvis. Edwards proposed they explore whether human in vitro fertilization could be used to treat infertility. Over the coming decade, the two worked feverishly and published a dazzling string of high-profile scientific papers describing every aspect of what would be required to make human in vitro fertilization possible.
In 1972, Steptoe and Edwards started human trials. Working with nurse Jean Purdy, they carefully extracted eggs from more than one hundred different women, fertilized them with sperm, and then tried to surgically implant the now-fertilized eggs into the prospective mothers. Every one of these efforts failed. In 1976, a woman finally became pregnant with an egg fertilized in vitro, but the pregnancy failed when the early-stage embryo attached outside the uterus. Then, in 1977, Lesley Brown, a homemaker from Bristol, England, entered the clinic. Lesley and her husband, John, a railroad worker, had been trying without success to have a baby for nine years and were desperate.
Lesley's pregnancy took with the first fertilized egg implanted. Nine months later, on July 25, 1978, their healthy baby, Louise, was born. Newspapers across the globe heralded the "baby of the century." When asked just a few months later, a shocking 93 percent of Americans said they had heard about the English baby born from an egg fertilized outside her mother.12
[Photo: the birth of the world's first test-tube baby, Louise Brown, in 1978. Source: "The Birth of the World's First Test-Tube Baby Louise Brown in 1978," News East West, July 21, 2013, https://bit.ly/2J3Ymcr.]
Although Louise was conceived in a dish, the popular perception was that she, and babies like her, were created in a test tube. The pejorative name test-tube babies stuck. Many people, like the majority of Americans polled by Pew that year, had a favorable view of this process.13 Others felt differently.
Catholic theologians called the test-tube baby process "unnatural" and a "moral abomination" because it did not involve sexual consummation between husband and wife and because the process generated embryos that were not implanted and therefore needed to be discarded.14 The American Medical Association opposed test-tube baby-making as too aggressive. Nova magazine called it "the biggest threat since the atom bomb." Leading conservative bioethicist Leon Kass said it called into question "the idea of the humanness of our human life and the meaning of our embodiment, our sexual being, and our relation to ancestors and descendants."15 Lesley and John Brown were inundated with hate mail, including blood-spattered parcels containing plastic fetuses.
But, as in many such situations, a process that was once shocking and controversial became more accepted and normalized over time. As the science of “test-tube babies” became less controversial and developed a more technical name, a group of scientists was already imagining the next frontier. Why couldn’t cells, they wondered, be taken from an early-stage preimplanted embryo during IVF and then sequenced using ever-advancing sequencing technology?
As early as 1967, IVF pioneer Robert Edwards and his British colleague Richard Gardner described their process for removing a few cells from preimplanted rabbit embryos and screening them under the microscope to determine the sex of the future bunny.16 In 1990, twelve years after Louise Brown was born, doctors, for the first time, successfully screened a preimplanted human embryo for gender and a few sex-linked and single-gene disorders. This screening process became known as preimplantation genetic diagnosis, or PGD. The PGD procedure developed rapidly, particularly for higher-risk prospective mothers. A parallel and related process called preimplantation genetic screening, or PGS, was also developed to screen embryos without a known disease risk to assess their chances of thriving. PGD and PGS have more recently been grouped together semantically under the broader umbrella label of preimplantation genetic testing, or PGT.
PGT has been around for nearly thirty years, but these are still the early days of this incredibly significant procedure. At first, scientists used PGT primarily to test for the chromosomal abnormalities that can cause miscarriages. It then started to be used to test for a small number of specific single gene disease-causing mutations. Today, PGT is also used to screen for some of the estimated ten thousand single-gene mutation disorders.17 Unlike prenatal testing of embryos already in the mother’s womb, PGT can be carried out on multiple, early-stage fertilized eggs, or blastocysts, in a dish.
In most cases, diseases that PGT tests for in these unimplanted embryos are exceedingly rare individually. Collectively, however, they are not. Statistics vary, but recent studies estimate the likelihood of having a traditionally conceived child who carries one of these diseases at about 1 to 2 percent.18 For the rapidly growing number of screenable single-gene-mutation diseases, the likelihood of a child conceived through IVF and PGT carrying the disease would be massively reduced.19
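A simple probability sketch shows why screening several embryos changes the odds so sharply. The per-embryo risk is an illustrative assumption, not a statistic from the text, and real PGT outcomes also depend on embryo viability and test accuracy.

```python
# Illustrative odds that PGT finds at least one unaffected embryo.
# Assumption (not from the text): each embryo independently carries the
# disease with probability p, e.g. p = 0.25 when both parents carry a
# recessive mutation. Embryo viability and test error are ignored here.

def chance_of_unaffected(p_affected: float, n_embryos: int) -> float:
    """Probability at least one of n screened embryos is unaffected."""
    return 1 - p_affected ** n_embryos

for n in (1, 3, 5, 8):
    print(f"{n} embryo(s): {chance_of_unaffected(0.25, n):.4%} chance "
          "of at least one unaffected embryo")
```

Selecting among the unaffected embryos is what drives the "massively reduced" likelihood described above.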
As the number of harmful genetic abnormalities that can be avoided by using IVF and PGT increases, parents will need to weigh the costs and benefits of conceiving children through sex versus in the lab. And while the considerable health and other benefits of conception inside the woman through sex will remain constant—and some small, additional risk associated with the IVF process itself could conceivably be uncovered—the real and perceived health benefits of IVF and embryo screening are likely to increase over time.
Think about all the precautions parents take to protect their children from harm and help them thrive. Mothers swallow prenatal vitamins, douse their own hands and their children’s hands in antibacterial sanitizers, have their young kids wear seat belts in cars and helmets on bikes, and serve their children healthy foods. Even though the risk of each different danger varies, modern parents have come to believe that a big part of their job description involves reducing these risks as much as possible and often disdain other parents who make different choices. Most American parents’ response to the anti-vaccination movement is a case in point.
When 147 mostly unvaccinated children were infected with measles in 2015, after exposure at Disneyland, the children’s parents were roundly condemned for putting hundreds of other kids in danger.20 While anti-vaccination advocates argue they are doing something “natural” by not vaccinating their children for communicable diseases, it is hard to argue they are actually doing something good.
Vaccinations have saved millions of lives since the first smallpox vaccine was introduced in late-eighteenth-century England. Repeated studies around the world have clearly proven the safety and overwhelming individual and communal benefits of vaccination.21 Nevertheless, irrational and uninformed fears of vaccines have persisted. In recent years, celebrities like Jenny McCarthy, Jim Carrey, and Donald Trump22 have raised scientifically unsupported claims about the dangers of vaccines that have fueled a quadrupling of the number of unvaccinated U.S. children since 2001.23 This same type of conflict between groups of parents harnessing scientific advances and those rejecting them as unnatural will also play out with embryo screening.
With the increasing quality and ease of noninvasive prenatal blood tests, many parents already have access to ever more information about the genetic status of the embryo growing inside the mother. But the anguish of deciding whether to terminate a pregnancy based on genetic abnormalities that could lead to later-stage problems will seem more painful and less beneficial than selecting a preimplanted embryo up front based on statistical probabilities of health.24
As the number of single-gene-mutation diseases that can be screened for during IVF and PGT continues to rise, the cost goes down, and the safety of IVF and PGT improves, the value of screening and selecting embryos in the laboratory prior to implantation will increase. At first, parents will balance their faith in reproduction by sex against the benefits of embryo screening. This will not, over time, be a fair fight. With more genetic diseases becoming avoidable, parents who conceive children the old-fashioned way will seem like today’s anti-vaccination zealots.
As societal norms about baby-making change, more prospective parents will come to see conception through sex as unnecessarily risky. We’ll still have sex for all the wonderful reasons we do now, just not as much for making babies. More parents will want their children to be conceived outside the mother so the embryos can be sequenced, selected, and, in the more distant future, altered.