The United Nations Conference on Climate Change in Paris was an example of a winning relationship between science and policy, even though it took too long to happen and achieved too little. Politicians will always be responsible for decision making, but such urgent global problems need a better sort of “science-based policy.” It is a question not only of politicians listening to scientists but also of science that is effective in self-governing and developing better-informed citizens.
Gene editing provides an important case study. The U.S. National Academy of Sciences, the National Academy of Medicine, the Chinese Academy of Sciences, and the U.K.’s Royal Society recently held a summit in Washington with international experts to discuss the scientific, ethical, and governance issues associated with human gene-editing research. The idea was to call for a moratorium on using the CRISPR/Cas9 technology to edit the human genome in a permanent and heritable way, because unintended consequences were to be expected. Among those calling for the moratorium were CRISPR’s inventors, but the summit ended with no big decisions. The national academies opted for a continuing discussion.
One winning argument for not making decisions was made by George Church of Harvard, who works in the field of human gene-editing: He opposed the idea of a ban by claiming it would strengthen underground research, black markets, and medical tourism—suggesting that science in a globalized economy is pretty much out of control. This is the kind of story one reads between the lines in the news.
The debate about artificial intelligence, led by Stephen Hawking, was also about science out of control. So are discussions about robots that can take over human jobs. Science facts and news are creating a big question mark: Is science out of control? Could it be different? Could a science exist that was under control?
The old answer was more or less the following: Science is about finding things out; ethics or policy will serve to decide what to do about them. That kind of answer is no help anymore, because science is very much able to change how things are, while the growing demand for science-based policy enables it to take part in the decision-making process. If a scientific narrative converges with laissez-faire ideology and the idea of complexity, the decision-making process becomes increasingly difficult and the situation seems to go out of control.
Science needs to do something about this. Ethics informs individual decision making but needs help dealing with complexity. Policy makes collective decisions but needs theories about the way the world is changing. Science is called to take part in decision making. But how can it, without losing its soul?
There cannot be a science under control. But there can be a science that knows how to deal empirically with choices and gets better at self-government. The current “news that will stay news” may be the shortcomings of the Paris conference and the summit scientists’ failure to decide about human gene-editing: It is a story that will stay news until an improved “science of the consequences” story comes along. Thus the scientific method must take into account the consequences of research. If the decision-making process is no longer confined to ethics and politics, epistemology is called upon to spring into action.
Creation of a “No Ethnic Majority” Society
David Berreby
Journalist; author, Us and Them
Throughout the history of the United States, white people have been the dominant ethnic group. The exact definition of this “race” has changed over time, as successive waves of immigrants (Germans in the 18th century, Irish in the 19th, Italians and Jews in the late 19th and early 20th) worked to be included in the privileged category (as recounted, for example, in Noel Ignatiev’s How the Irish Became White). Whatever “whiteness” meant, though, its predominance persisted—both statistically and culturally (as the no-asterisk default definition of “American”). Even today, long after the legal structure of discrimination was undone, advantages attach to white identity in the job market, in housing, in education, and in any encounter with authority. Not unrelatedly, life expectancy for whites is greater than for African-Americans.
But this era of white predominance is ending.
Not long after 2040, fewer than half of all Americans will identify as white, and the country will become a majority-minority nation—47 percent white, 29 percent Hispanic, 13 percent African-American, and 11 percent “other,” according to U.S. Census Bureau projections. Given this demographic shift, the habits and practices of a white-dominated society cannot endure much longer. Political, legal, cultural, and even personal relations between races and ethnic groups must be renegotiated. In fact, this inevitable process has already begun. And that’s news that will stay news, now and for a long time to come. It is driving a great deal of seemingly unrelated events in disparate realms, from film criticism to epidemiology.
I’ll begin with the most obvious signs. In the past two years, non-whites have succeeded as never before in changing the terms of debates that once excluded or deprecated their points of view. This has changed both formal rules of conduct (for police, for students) and unwritten norms and expectations. Millions of Americans have recently come to accept the once-fringe idea that police frequently engage in unfair conduct based on race. And many now support the removal of memorials to Confederate heroes, and their flag, from public places.
Meanwhile, campuses host vigorous debates about traditions that went largely unquestioned two or three years ago. (It is now reasonable to ask why, if Princeton wouldn’t name a library after Torquemada, it should honor the fiercely racist Woodrow Wilson.) The silliness of some of these new disputes (like Oberlin students complaining that the college dining hall’s Chinese food was offensive to Chinese people) shouldn’t obscure the significance of the trend. We are seeing inevitable ethnic renegotiation, as what was once “harmless fun” (like naming your football team the Redskins) is redefined as something no decent American should condone.
It’s nice to imagine this political and cultural reconfiguring as a gentle and only slightly awkward conversation. But evidence suggests that the transition will be painful and its outcome uncertain.
Ethnic identity (like religious identity, with which it is often entangled) is easy to modify over time but difficult to abandon. This is especially true when people believe their numbers and influence are declining. In that situation, they become both more aware of their ethnicity and more hostile to “outsiders.” (In a 2014 paper, for example, the social psychologists Maureen A. Craig and Jennifer A. Richeson found that white citizens who’d read about U.S. demographics in 2042 were more likely to agree with statements like “It would bother me if my child married someone from a different ethnic background” compared with whites who had read about 2010’s white-majority demographics.*) Such feelings can feed a narrative of lost advantage even when no advantage has been lost. Though whites remain privileged members of American society, they can experience others’ gains toward equality as a loss for “our side.” The distress of white people over the loss of their predominance—a sense that “the way things were before was better”—has rewarded frankly xenophobic rhetoric and the candidates who use it.
We should not imagine, though, that this distress among some whites is manifested merely rhetorically. In a recent analysis of statistics on sickness and death rates, the economists Anne Case and Angus Deaton found that middle-aged white people in the United States have been dying by suicide, drug abuse, and alcohol-related causes at extraordinary rates.* The historian and journalist Josh Marshall has pointed out that this effect is strongest among the people who, lacking other advantages, had the most stake in white identity: less educated, less skilled, less affluent workers. (Other scholars have disputed details of Case and Deaton’s analysis but not its overall point.)
If the Case-Deaton statistics reflected only economic distress, then middle-aged working-class people of other ethnic groups should also be missing out on the general health improvements of the last few decades. This is not so. Unskilled middle-aged African-Americans, for example, have lower life expectancy than equivalent whites. Yet their health measures continually improved over the period during which those of whites stalled.
For this reason, I think Marshall is right and the Case-Deaton findings signal a particularly racial distress. The mortality rates correlate with loss of privilege, of unspoken predominance, of a once undoubted sense that “the world is ours.” Over the next ten or twenty years, this ongoing news could turn into a grim story of inter-ethnic conflict.
Can scientists and other intellectuals do anything to help prevent the inevitable ethnic reconfiguration from being interpreted as a zero-sum conflict? I think they can. For one thing, much is unknown about the psychology, and even physiology, of loss of ethnic advantage. Much could be learned by systematic comparative research on societies in which relations among social groups were swiftly renegotiated so that one group lost privilege. South Africa after the fall of apartheid is one such place; Eastern Europe during and after the fall of Communism may be another.
We could also sharpen up our collective understanding of the slippery psychology of ethnic threat with an eye toward finding methods to understand and cope with such feelings. To do that, we need to take people’s perceptions about identity seriously. Happy talk about the wonders of diversity and the arc of history bending toward justice will not suffice. We need to understand how, why, and when some people on this inevitable journey will experience it as a loss.
Interconnectedness
Irene Pepperberg
Research associate, lecturer, Harvard University; adjunct associate professor, Brandeis University; author, Alex & Me
No man is an island
Entire of itself . . .
John Donne wrote these words almost 400 years ago, and (aside from the sexism of the male noun) they are as true now as they were then. I believe they will be just as true in the future, and that they apply to scientific discovery as well as to philosophy. The interconnectedness of humans, and of humans and their environment, that science is demonstrating today is just the beginning of what we will discover, and it is news very likely to still be discussed in the future.
From the science of economics to that of biology, we are learning how the actions and decisions of each of us affect the lives of all others. The coal-fired power plants of India, China, and elsewhere affect the climate of us all, as does the ongoing deforestation of the Amazon. The nuclear disaster in Japan shaped how we view one alternative energy source. And we now know that our health (particularly our microbiome) is affected not only by what we put into our mouths but, somewhat surprisingly, also by the company we keep. Recent studies show that decisions about the removal of an invasive species affect its entire surrounding ecological web as much as decisions concerning the protection of an endangered species.
One need not necessarily buy into Donne’s somewhat dark worldview to appreciate the importance of his words. Interconnectedness means that the scientists of the world work to find a cure for a disease such as Ebola that has, so far, primarily been limited to a few countries. It also means that governments recognize how reacting to the plight of refugees from war-torn areas halfway around the globe could be a means of enriching rather than impoverishing one’s country.
Whether we look at social media, global travel, or any other form of interconnectedness, news of its importance is here to stay.
Early Life Adversity and Collective Outcomes
Linda Wilbrecht
Associate professor, UC Berkeley Department of Psychology and Helen Wills Neuroscience Institute
When fear and tension rise across racial and ethnic divisions, as they have in recent years, genetic arguments to explain behavioral differences can quickly become popular. However, we know racial and ethnic groups may also be exposed to vastly different experiences likely to strongly affect behavior. Despite our seemingly inexhaustible interest in the nature/nurture debate, we are only starting to learn how the interaction of genes with experience may alter the potential of individuals, and to see how individual decision-making styles can alter the potential wealth of nations.
A recent captivating news image depicted two sets of male identical twins, mixed up as infants and raised in separate families in Colombia. The boys and their families assumed they were fraternal twins, who share genes only as much as siblings do and therefore don’t look alike. Only in adulthood did the young men discover the mistake and find their identical twin brothers, after friends recognized them. One mixed pair of twins grew up in the city and the other in the countryside with far more modest resources. We would like to know how these different environments altered these men’s personalities, preferences, intelligence, and decision making, when their genes were the same. We’re all probably comfortable with the idea that trauma, hardship, or parenting style can affect our emotional development and emotional patterns even in adulthood. But it’s less clear how early experience might affect how we think and make decisions. Is one identical twin, because he was raised in a different situation, more likely to save money, repeat a mistake, take a shortcut, buy lottery tickets, resist changing his mind? Or would the identical-twin pair make the same choices regardless of upbringing? The answers from these twins are still emerging, and the sample is, of course, small and anecdotal. If we knew the answers, it might change how we view parenting and investment in child care and education.
A growing body of work now effectively models this “twins raised apart” situation in genetically identical strains of inbred mice or genetically similar rats. Rodents get us away from our cultural biases and can be raised in conditions that model human experiences of adversity and scarcity in infancy and childhood. In one early-life stress model, the mother rodent is not given adequate nesting material and moves about the cage restlessly, presumably in search of more bedding. In other models, the rodent pups are separated from the mother for parts of the day or are housed alone after weaning. These offspring are then compared to offspring that have been housed with adequate nesting material and have not been separated more than briefly from their mother or siblings.
Investigators first focused early-life-adversity research in rodent models on emotional behavior. This research found that early-life adversity increased adult stress and anxious behavior. More recent studies, including some from my lab, find that early-life adversity can also affect how rodents think—how they solve problems and make decisions. Rats and mice subjected to greater early-life stress tend to be less cognitively flexible (stubbornly applying old rules in a laboratory task after new rules are introduced), and they may also be more repetitive or forgetful. Some of the differences in behavior fade as the animals age, but others grow stronger and persist into adulthood. I hesitate to say that one group is smarter, since it’s hard to determine what would be optimal for a wild rodent. We might see stubborn, inflexible, or repetitive behavior as unintelligent in a laboratory test. In some real-world situations, the same behavior might be admirable, as “grit” or perseverance.
Competing theories attempt to explain these changes in emotional behavior, problem solving, and decision making. The brains of the mice that experienced adversity may be dysfunctional, in line with evidence of atrophy of frontal neurons after stress. Alternatively, humans and rodents may show positive adaptation to adversity. In one model currently growing in popularity, a brain developing under adversity may adopt a “live fast–die young” strategy favoring earlier maturation and short-term decision making. In this adaptive calibration model, genetically identical animals might express different sets of genes in their brains, and develop different neural circuits, in an attempt to prepare their brains for success in the kind of environments in which they found themselves. It is unclear how many possible trajectories might be available or when or how young brains are integrating environmental information. However, based on this adaptive calibration model, in lean times versus fat times you might expect a single species to “wire-up” different brains and display different behaviors without genetic change or genetic selection at the germline level.
Why should we care? These data might explain population-level economic behavior and offer a powerful counternarrative to seductive genetic explanations of success. Another piece of captivating news out recently was Nicholas Wade’s review of Garett Jones’s Hive Mind in the Wall Street Journal. Reading this review, I learned that national savings rates correlate with average IQ scores even if individual IQ scores do not; the book’s subtitle is “How Your Nation’s IQ Matters So Much More Than Your Own.” In his review, Wade suggests that we need to look to “evolutionary forces” to explain IQ and its correlated behavioral differences. The well-controlled data from experiments in mice and rats suggest that we also look to early-life experience. Rodents are not known for their high IQ, but the bottom line is that what intelligence they have is sensitive to early-life experience even when we hold germline genetics constant. Putting these rodent and human findings together, one might hypothesize that humans exposed to instability or scarcity in early life are developing brains wired for shorter-term investment and less saving. Thus, rather than attributing the successes and failures of nations to slow-changing genetic inheritance, we might foster a brighter future by paying more attention to the quality of early-life experiences.