by Jamie Metzl
The doctor can’t seem to decide if you’re joking but plays things cautiously. She walks over and sits in the chair beside you. “What I’m saying,” she says softly, enunciating each word, “is that we are beginning to understand the genetic patterns underlying different personality styles, and people who want that information when selecting which of their embryos to implant are entitled to it by law—provided they sign a waiver before getting that information.”
“A person’s personality comes from so many different sources,” you say, still trying to hold on to the magical unknown of being human. “How can you reduce all of that to genetics?”
“We can’t,” the doctor replies, “full stop.” She pauses a moment to let her point settle. “But we can offer statistical probabilities. If you choose to do so, you have the ability to select the embryo from among your six that has the highest statistical likelihood relative to the others of having whichever personality style you choose.”
“Something doesn’t seem right about that,” you say. “It feels like I’d be ordering my child from Starbucks—light on the milk, extra shot of espresso, three pumps of mocha.”
“I’m not here to convince you one way or another,” she says, leaning back. “I’m just explaining your options. It’s really up to you.”
Now your mind is racing. You think back to your own childhood, how surprised your parents were that you were so great at math when neither of them could balance a checkbook. You remember how proud you felt overcoming your shyness to sing in your school talent show. You remember all of the unknown mysteries that unraveled over the course of your life. Would you have felt the same if your parents had selected options for you off a menu? Would they have been as happy when you sang in the talent show or just known you would do it because you had already been genetically optimized for extroversion?
But then again, you counter in your head, all of these embryos are my natural children. One of them will be born into a world where other parents are making these same decisions. If I’m going to invest the coming decades of my life in helping my future child flourish in every way, why wouldn’t I pick the embryo with the best shot? You feel your arm quivering. Your hand inadvertently squeezes the stylus ever harder.
To sign or not to sign, that is the question.
The more we understand about how our genetics work, the better able we will be to select for more traits—and more genetically complex traits—in our future children.
While we must always be humble about the limits of our knowledge, we are also an aggressive and hubristic species that has always pushed our limits. As our ancestors developed the brain capacity that made group hunting, language, art, and complex social structures possible, we began developing the tools to change the environment around us. When we developed agriculture, cities, and medicine, we gave the middle finger to nature as we had found it and it had found us.
But even then, the kinds of choices made in our hypothetical fertility clinic will still require that we figure out how much of what we are stems from our innate biology and how much comes from the broader environment around us. If a disease or a trait is only minimally genetic, selecting for or against it during IVF and PGT makes little difference. For dominant single-gene-mutation diseases like Huntington’s disease, or genetic traits like eye color, genes are almost entirely determinative. For lung cancer acquired from a lifetime of smoking, on the other hand, they are not.
Fixing our genetic diseases and potentially selecting genetic traits requires that we first figure out the extent to which each of our disorders and traits is determined by our genes. Assessing where biology ends and the environment begins is just another way of describing our age-old debate over the balance between nature and nurture.
Our forebears have debated this for millennia. Plato believed that humans are born with innate knowledge, an assertion later challenged by his star pupil, Aristotle, who argued that knowledge is acquired. Confucius famously wrote in the sixth century BC, “I am not one who was born in the possession of knowledge,” casting an unwitting vote for Aristotle. In the seventeenth century, Descartes expressed his Platonic belief that human beings are born with certain innate ideas that undergird our general approach to and attitude toward the world. Hobbes and Locke, on the other hand, believed that lived experience exclusively determined the characteristics of a person. Almost everyone these days would agree that the answer to the nature versus nurture debate is “both,” because nature and nurture are both dynamic systems continually interacting. That this is undoubtedly true doesn’t make figuring out how genetically determined we are any less important.
This is not just a philosophical question. If we are mostly nature, if our genetics in major ways determine who we are, then fixing a problem or making a change would need to happen on a genetic level. If we are mostly nurture, mostly influenced by the environmental factors around us, then we’d be crazy to think about altering our complex genetics to change outcomes when more benign environmental changes could do the trick.
It’s impossible to draw an exact line between nature and nurture, but generations of twin studies have helped scientists understand the role genetics play in shaping who we are. Identical twins separated at birth provide a particularly valuable natural experiment: the bigger the role genetics play, the more likely identical twins raised apart will wind up similar to each other.
Identical twins are almost entirely genetic carbon copies of each other at birth, so if humans were 100 percent genetic beings, these twins would remain fully identical throughout their lives. Schizophrenia in twins is a good test case. When one identical twin has this chronic brain disorder, the other has it about half the time, compared to less than 15 percent of the time for fraternal twins, suggesting that schizophrenia has a significant genetic foundation. But because not all identical twins share the condition, we know that significant environmental and other nongenetic factors are also involved.
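To see how such concordance figures can be translated into a rough heritability number, here is a back-of-the-envelope sketch using Falconer’s classic formula, which simply doubles the gap between identical and fraternal twins. Treating concordance rates as stand-ins for twin correlations is a simplification, so the result is illustrative only.

```python
# A rough back-of-the-envelope heritability estimate from twin data, using
# Falconer's classic formula: h^2 ~= 2 * (r_identical - r_fraternal).
# NOTE: treating raw concordance rates as stand-ins for twin correlations is a
# simplification; real analyses use liability-threshold models and larger samples.

def falconer_heritability(mz_concordance: float, dz_concordance: float) -> float:
    """Approximate heritability by doubling the identical-vs-fraternal twin gap."""
    return 2 * (mz_concordance - dz_concordance)

# Figures from the text: ~50 percent for identical (MZ) twins,
# ~15 percent for fraternal (DZ) twins.
h2 = falconer_heritability(0.50, 0.15)
print(f"Rough heritability estimate: {h2:.2f}")  # -> 0.70
```

Plugging in the figures above yields roughly 0.70, that is, a trait that appears to be substantially, but far from entirely, genetic.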
Psychologist and geneticist Thomas Bouchard’s multidecade study of twins separated at birth found that identical twins raised separately had about the same chance of sharing personality traits, interests, and attitudes as identical twins raised together.1 We’ve all heard the incredible stories of identical twins separated at birth who reunite later in life to find they are shockingly similar.
Identical twins Jim Lewis and Jim Springer were separated when they were only four weeks old. When they met again thirty-nine years later, in 1979, they found they both bit their nails, had constant headaches, smoked the same brand of cigarettes, and drove the same style of car to the same Florida beach. These stories are not mere anecdotes but indicators of a deeper genetic message. Although the twin studies demonstrated that similarities between twins have more to do with genetic and biological factors than with environmental ones, this did not at all negate the critical importance of love, parenting, family, and all types of nurture.2 Scores of studies like this have been carried out around the world.
In 2015, an enterprising group of scientists, using findings from most of the twin studies of the previous fifty years, tried to draw conclusions from a mind-boggling 2,748 research publications exploring 17,804 traits among 14,558,903 pairs of twins in thirty-nine different countries. Using big-data analytics to better pinpoint the balance between genetic and environmental influences, they confirmed that all of the measured human traits are at least partly heritable, but some more than others. At the high end, the measured neurological, heart, personality, ophthalmological, cognitive, and ear-nose-and-throat disorders were found to be mostly genetic. Across all traits, the average heritability was 49 percent.3 If these findings are correct, humans are about half defined by our genetics. This would be good news for classicists: Plato and Aristotle were both correct.
That we are, overall, probably about half nature and half nurture intuitively feels right. Most parents say they immediately sensed their newborn child had a sunny, anxious, or stormy disposition. Part of this, I am sure, is historical revisionism after a child grows up optimistic, nervous, or belligerent, but part of it is also our recognition that a big part of who we are is based on the biology we inherit. Assuming for the moment that we are about half nature, with some traits being more genetic than others, how far might we go in understanding that genetic and biological part of ourselves?
We know from our intuition and from the research that height is a predominantly genetic trait, but not an exclusively genetic one. Protein, calcium, and vitamins A and D are essential to helping children grow to their potential. When a disastrous famine struck North Korea in the 1990s, widespread malnutrition stunted the growth of a generation of young North Koreans. The older North Koreans who had come of age prior to the famine were closer to the average South Korean height, but the younger North Koreans who grew up in the 1990s are up to three inches shorter than their South Korean counterparts.4 This tells us that, whatever your genes might predict, your height can be stunted if you don’t get the nutrients you need as a child.
If height is mostly genetic, the next step is to figure out which genes have the most to say about how tall we are. There are a small number of single genetic mutations that can make a person very tall or very short. A mutation in the FBN1 gene, for example, can cause Marfan syndrome, a condition that usually makes people very tall and thin with an extra-long arm span. (Olympic swimmer Michael Phelps and U.S. president Abraham Lincoln both showed symptoms of this mutation.) Achondroplasia, on the other hand, is a mutation in the FGFR3 gene that causes short-limb dwarfism.
But examples like these of single-gene mutations having a major impact on height are extremely rare. In most cases, height is influenced by hundreds or even thousands of genes as well as by environmental factors like nutrition.5 Experts believe that about 60 to 80 percent of the difference in height between people is based on genetics.6 This makes intuitive sense. A person is generally not tall because he or she has one single elongated part, like the neck of a giraffe. Instead, we are tall because each part of us is a little longer. So far, researchers have identified about 800 different genes believed to influence height in one way or another.
Although the full list of genetic height determinants has not yet been uncovered, Stephen Hsu, a theoretical physicist and vice president for research at Michigan State University, has done incredible work demonstrating how height can be accurately predicted using only the known genetic factors. Drawing on five hundred thousand sequenced genomes from the UK Biobank, Hsu and his collaborators built a model to predict people’s height from their genetics alone. When they compared these genetic predictions to people’s actual heights, the predictions were, remarkably, accurate on average to within about an inch.7
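To give a flavor of what sits under the hood of such a predictor, here is a minimal sketch of the core idea: a polygenic estimate is essentially a weighted sum of how many copies of each relevant variant a person carries. The variant names, effect sizes, and baseline below are invented for illustration; Hsu’s actual models are trained on hundreds of thousands of genomes and draw on thousands of variants, each with a tiny effect.

```python
# A minimal sketch of the core idea behind a polygenic predictor: estimate a
# trait as a weighted sum of how many copies of each relevant variant a person
# carries, added to a population baseline. The variant names, effect sizes, and
# baseline are invented for illustration only.

effect_sizes_cm = {   # hypothetical per-copy effects on height, in centimeters
    "variant_A": +0.4,
    "variant_B": -0.2,
    "variant_C": +0.1,
}

def predict_height_cm(genotype: dict, baseline_cm: float = 170.0) -> float:
    """Sum each variant's small contribution on top of a population baseline."""
    score = sum(effect_sizes_cm[variant] * copies for variant, copies in genotype.items())
    return baseline_cm + score

# Someone carrying 2 copies of variant_A, 1 of variant_B, and 0 of variant_C:
person = {"variant_A": 2, "variant_B": 1, "variant_C": 0}
print(round(predict_height_cm(person), 1))  # -> 170.6
```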
Predicting someone’s height from their genetics is useful for catching growth issues early in a child’s life, but the stakes of this research are considerably higher. Being able to predict this one complex trait opens the door to understanding, predicting, selecting, and ultimately altering any complex trait and many complex and heritable genetic diseases. Hsu and others have already applied similar computational algorithms to predicting heel-bone density, and the same approach is being used to predict partial genetic predispositions to diseases like Alzheimer’s, ovarian cancer, schizophrenia, and type 1 diabetes, as well as other polygenic traits, once the sequenced genomes of enough people with each disease or trait have been entered into a shareable database.8
It makes sense that predicting a trait that is entirely genetic would be easier if the trait is influenced by just a few genetic markers rather than hundreds or thousands. That’s why we need bigger data pools to predict more complex traits than we do for the genetically simpler ones. If a trait is only partly genetic, on the other hand, we can use genetic data sets to predict only its heritable portion. If we have both a sense of how heritable a given trait is and a rough guess of the number of genes influencing it, we can begin to estimate how many people’s sequenced genomes and life records would be needed to predict the genetic portion of that trait from genetic data alone. For most common adult chronic diseases, this number is estimated at around a million. For many psychiatric diseases, it is estimated at about one to two million.9 Most complex and polygenic traits can probably be well predicted from data sets in the low millions, and the larger and higher-quality the data sets, the better.
Height and genetic disease risks are extremely complex and mostly genetic traits, but intelligence, among the most important and complex of all human traits, will be even tougher to tackle.
As long as the concept of intelligence has existed, people have debated what it is and how to measure it. General intelligence is extremely difficult to define, even though many people have tried. “The ability to reason, plan, solve problems, think abstractly, comprehend abstract ideas, learn quickly, and learn from experience” captures a lot but also misses a lot.10
Intelligence, like all traits, is only valuable within a particular context, and there are about as many different types of intelligence as there are people. No one is smart, beautiful, or strong in absolute terms. Take someone out of their environment and their form of intelligence may become less or more valuable. Albert Einstein was smarter than most of us, but I’m not entirely sure his type of brilliance would have helped him find food or water in a time of scarcity.
But although it is politically tempting to argue that any ranking of intelligence constitutes inherent discrimination, our progress as a species demands that we rank intelligence both in general and for specific tasks, and that we do our best to ensure that the most capable people are doing the tasks they are best suited for (lest we have our abstract artists running our nuclear power stations). In a world where people’s genetic capacities might be matched to their roles, and where general intelligence might be prioritized, we would by definition need to respect many types of intelligence so we can maximally benefit from this diversity (while better realizing our own humanity). Saying that intelligence is diverse, however, cannot prevent us from recognizing that many of its forms are also hierarchical, especially within particular contexts.
The battle over heritability has been waged longer and more fiercely for intelligence than for perhaps any other trait. In the late 1800s, the English scientist and polymath Sir Francis Galton—whom we will meet again later when we explore the travesty of eugenics—attempted to measure and compare the sensory and other qualities of British noblemen with those of commoners. This biased effort to demonstrate the gentry’s genetic superiority went nowhere but highlights that the very idea of intelligence testing was fraught from the start. A couple of decades later, the French psychologist Alfred Binet designed a series of questions for children to determine which children needed special help in school. Because the questions were believed to be ones the average student could answer, children unable to answer them were considered to possess lower-than-average intelligence.
This idea of a standard, average intelligence, with people either above or below that bar, was further developed by the German psychologist William Stern, who fixed the average intelligence quotient, or IQ, at 100. Someone with an IQ of 120 would, according to this model, have a 20 percent higher IQ than the average; an IQ of 80 would be 20 percent lower. The English psychologist Charles Spearman recognized that children’s cognitive abilities correlated across multiple subjects—that children who did well in one area tended to do well in others—and created the concept of a general factor of intelligence.11
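The arithmetic behind this ratio-style quotient is about as simple as a formula gets, as the short sketch below illustrates for the 120-versus-80 example; modern IQ tests use a more sophisticated statistical scaling, but the 100-centered idea is the same.

```python
# The ratio-style intelligence quotient described in the text: mental age (the
# age level of the test questions a child can handle) divided by chronological
# age, scaled so that average performance lands exactly at 100.

def ratio_iq(mental_age: float, chronological_age: float) -> float:
    return 100 * mental_age / chronological_age

print(ratio_iq(12, 10))  # 120.0 -> performing 20 percent above one's age level
print(ratio_iq(8, 10))   # 80.0  -> performing 20 percent below it
```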
Over the ensuing years, the concept of IQ spread rapidly around the world and was used by countless organizations, most famously the U.S. military during World War I, to assess a general aptitude applicable to multiple tasks. Although IQ tests probably did help the U.S. military measure certain types of aptitude, test results showing IQ disparities between groups were also, ominously, used by some high-profile scholars like Princeton’s Carl Brigham to argue that immigration and racial integration would weaken the American gene pool.
Despite this questionable background, IQ, as many studies have shown and as our hypothetical fertility doctor explained, correlates well with a person’s health, education level, prosperity, and longevity.12 Many of the general cognitive abilities most beneficial to our ancestors—including memory, pattern recognition, language ability, and proficiency at math—are positively correlated, and people with high IQs tend to score well on most other cognitive tests. For many critics, however, IQ remains self-validating, inaccurate, disrespectful of difference, racially and socioeconomically biased, dangerous, and, overall, highly questionable.13
This debate reached a fever pitch in the United States following the publication of the 1994 book The Bell Curve: Intelligence and Class Structure in American Life by Richard Herrnstein and Charles Murray, which described intelligence as the new dividing line in American society. At the start of the book, the authors take the defensible position that general factor cognitive ability can be reliably measured, differs between people, and is somewhere between 40 and 80 percent heritable. Although Herrnstein and Murray recognized that intelligence has both genetic and environmental components, they controversially suggested that genetics explains why some groups score lower than others on IQ tests and that restricting government incentives for poor women to procreate would increase average IQ in the United States.14 By simultaneously raising the taboo topics of intelligence differences between people and groups, IQ, and race, Herrnstein and Murray smashed into the buzz saw of progressive public opinion.
New York Times columnist Bob Herbert called The Bell Curve “a scabrous piece of racial pornography masquerading as serious scholarship.”15 Harvard’s Stephen Jay Gould, a longtime critic of the concept of general factor intelligence, argued that environmental factors like prenatal nutrition, home life, and access to quality education had a more significant influence on a person’s intelligence than Herrnstein and Murray accounted for.16 Other critics attacked the book as reductionist, scientifically sloppy, and dangerously biased.17