Enlightenment Now


by Steven Pinker


  pragmatic experts who drew on many analytical tools, with the choice of tool hinging on the particular problem they faced. These experts gathered as much information from as many sources as they could. When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as “however,” “but,” “although,” and “on the other hand.” They talked about possibilities and probabilities, not certainties. And while no one likes to say “I was wrong,” these experts more readily admitted it and changed their minds.47

  Successful prediction is the revenge of the nerds. Superforecasters are intelligent but not necessarily brilliant, falling just within the top fifth of the population. They are highly numerate, not in the sense of being math whizzes but in the sense of comfortably thinking in guesstimates. They have personality traits that psychologists call “openness to experience” (intellectual curiosity and a taste for variety), “need for cognition” (pleasure taken in intellectual activity), and “integrative complexity” (appreciating uncertainty and seeing multiple sides). They are anti-impulsive, distrusting their first gut feeling. They are neither left-wing nor right-wing. They aren’t necessarily humble about their abilities, but they are humble about particular beliefs, treating them as “hypotheses to be tested, not treasures to be guarded.” They constantly ask themselves, “Are there holes in this reasoning? Should I be looking for something else to fill this in? Would I be convinced by this if I were somebody else?” They are aware of cognitive blind spots like the Availability and Confirmation biases, and they discipline themselves to avoid them. They display what the psychologist Jonathan Baron calls “active open-mindedness,” with opinions such as these:48

  People should take into consideration evidence that goes against their beliefs. [Agree]

  It is more useful to pay attention to those who disagree with you than to pay attention to those who agree. [Agree]

  Changing your mind is a sign of weakness. [Disagree]

  Intuition is the best guide in making decisions. [Disagree]

  It is important to persevere in your beliefs even when evidence is brought to bear against them. [Disagree]

  Even more important than their temperament is their manner of reasoning. Superforecasters are Bayesian, tacitly using the rule from the eponymous Reverend Bayes on how to update one’s degree of credence in a proposition in light of new evidence. They begin with the base rate for the event in question: how often it is expected to occur across the board and over the long run. Then they nudge that estimate up or down depending on the degree to which new evidence portends the event’s occurrence or non-occurrence. They seek this new evidence avidly, and avoid both overreacting to it (“This changes everything!”) and underreacting to it (“This means nothing!”).
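  To see the rule at work, here is a minimal sketch in Python (the helper function and all of its numbers are invented for illustration; they are not from Tetlock’s data):

    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    # Every number below is invented for illustration.
    def update(prior, likelihood_ratio):
        # likelihood_ratio = P(evidence | event) / P(evidence | no event)
        prior_odds = prior / (1 - prior)
        posterior_odds = prior_odds * likelihood_ratio
        return posterior_odds / (1 + posterior_odds)

    p = 0.10            # base rate: the event occurs 10 percent of the time
    p = update(p, 3.0)  # evidence three times likelier if the event is coming
    print(round(p, 2))  # 0.25 -- nudged up, but nowhere near certainty

The odds form makes the discipline visible: strong evidence moves the estimate a lot, weak evidence only a little, and no single datum flips it from impossible to certain.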

  Take, for example, the prediction “There will be an attack by Islamist militants in Western Europe between 21 January and 31 March 2015,” made shortly after the Charlie Hebdo massacre in January of that year. Pundits and politicians, their heads spinning with the Availability heuristic, would play out the scenario in the theater of the imagination and, not wanting to appear complacent or naïve, answer Definitely Yes. That’s not how superforecasters work. One of them, asked by Tetlock to think aloud, reported that he began by estimating the base rate: he went to Wikipedia, looked up the list of Islamist terrorist attacks in Europe for the previous five years, and divided by 5, which predicted 1.2 attacks a year. But, he reasoned, the world had changed since the Arab Spring in 2011, so he lopped off the 2010 data, which brought the base rate up to 1.5. ISIS recruitment had increased since the Charlie Hebdo attacks, a reason to poke the estimate upward, but so had security measures, a reason to tug it downward. Balancing the two factors, an increase by about a fifth seemed reasonable, yielding a prediction of 1.8 attacks a year. There were 69 days left in the forecast period, so he divided 69 by 365 and multiplied the fraction by 1.8. That meant that the chance of an Islamist attack in Western Europe by the end of March was about one in three. A manner of forecasting very different from the way most people think led to a very different forecast.
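  The arithmetic is easy to check. Here is a sketch of the calculation as reported (the attack counts are my inference from the rates quoted, not data stated in the text):

    # Reconstructing the forecaster's steps; 6 attacks over 2010-2014 is
    # implied by the rates he quoted (1.2 per year over 5 years, 1.5 over 4).
    rate = 6 / 5          # five-year base rate: 1.2 attacks per year
    rate = 6 / 4          # drop 2010 (implying it had none): 1.5 per year
    rate *= 1.2           # recruitment up, security up; net rise of a fifth: 1.8
    p = rate * 69 / 365   # 69 days left in the forecast window
    print(round(p, 2))    # 0.34 -- about one in three

Strictly, 0.34 is the expected number of attacks in the window rather than a probability, but for a rare event the two are close (a Poisson model would give 1 − e^−0.34 ≈ 0.29), so the shortcut serves.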

  Two other traits distinguish superforecasters from pundits and chimpanzees. The superforecasters believe in the wisdom of crowds, laying their hypotheses on the table for others to criticize or amend and pooling their estimates with those of others. And they have strong opinions on chance and contingency in human history as opposed to necessity and fate. Tetlock and Mellers asked different groups of people whether they agreed with statements like the following:

  Events unfold according to God’s plan.

  Everything happens for a reason.

  There are no accidents or coincidences.

  Nothing is inevitable.

  Even major events like World War II or 9/11 could have turned out very differently.

  Randomness is often a factor in our personal lives.

  They calculated a Fate Score by adding up the “Agree” ratings for items like the first three and the “Disagree” ratings for items like the last three. An average American is somewhere in the middle. An undergraduate at an elite university scores a bit lower; a so-so forecaster lower still; and the superforecasters lowest of all, with the most accurate superforecasters expressing the most vehement rejection of fate and acceptance of chance.
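  The tally itself is simple. A minimal sketch (the item wording follows the text; the 1-to-7 response scale is an assumption):

    # Fate Score: agreement with fate items plus disagreement with chance items.
    # A 1-7 agreement scale is assumed here for illustration.
    FATE = ["Events unfold according to God's plan.",
            "Everything happens for a reason.",
            "There are no accidents or coincidences."]
    CHANCE = ["Nothing is inevitable.",
              "Even major events like World War II or 9/11 could have "
              "turned out very differently.",
              "Randomness is often a factor in our personal lives."]

    def fate_score(ratings):  # ratings: dict mapping item -> 1..7 agreement
        return (sum(ratings[item] for item in FATE)
                + sum(8 - ratings[item] for item in CHANCE))

On this tally a committed fatalist scores near the top and a superforecaster near the bottom.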

  To my mind, Tetlock’s hardheaded appraisal of expertise by the ultimate benchmark, prediction, should revolutionize our understanding of history, politics, epistemology, and intellectual life. What does it mean that the wonkish tweaking of probabilities is a more reliable guide to the world than the pronouncements of erudite sages and narratives inspired by systems of ideas? Aside from smacking us upside the head with a reminder to be more humble and open-minded, it offers a glimpse into the workings of history on the time scale of years and decades. Events are determined by myriad small forces incrementing or decrementing their likelihoods and magnitudes rather than by sweeping laws and grand dialectics. Unfortunately for many intellectuals and for all political ideologues, this is not the way they are accustomed to thinking, but perhaps we had better get used to it. When Tetlock was asked at a public lecture to forecast the nature of forecasting, he said, “When the audience of 2515 looks back on the audience of 2015, their level of contempt for how we go about judging political debate will be roughly comparable to the level of contempt we have for the 1692 Salem witch trials.”49

  * * *

  Tetlock did not assign a probability to his whimsical prediction, and he gave it a long, safe deadline. It certainly would be unwise to forecast an improvement in the quality of political debate within the five-year window in which prediction is feasible. The major enemy of reason in the public sphere today—which is not ignorance, innumeracy, or cognitive biases, but politicization—appears to be on an upswing.

  In the political arena itself, Americans have become increasingly polarized.50 Most people’s opinions are too shallow and uninformed to fit into a coherent ideology, but in a dubious form of progress, the percentage of Americans whose opinions are down-the-line liberal or down-the-line conservative doubled between 1994 and 2014, from 10 to 21 percent. The polarization has coincided with an increase in social segregation by politics: over those twenty years, the ideologues have become more likely to say that most of their close friends share their political views.

  The parties have become more partisan as well. According to a recent Pew study, in 1994 about a third of Democrats were more conservative than the median Republican, and vice versa. In 2014 the figures were closer to a twentieth. Though Americans across the political spectrum drifted leftward through 2004, since then they have diverged on every major issue except gay rights, including government regulation, social spending, immigration, environmental protection, and military strength. Even more troublingly, each side has become more contemptuous of the other. In 2014, 38 percent of Democrats held “very unfavorable” views of the Republican Party (up from 16 percent in 1994), and more than a quarter saw it as “a threat to the nation’s well-being.” Republicans were even more hostile to Democrats, with 43 percent viewing the party unfavorably and more than a third seeing it as a threat. The ideologues on each side have also become more resistant to compromise.

  Fortunately, a majority of Americans are more moderate in all these opinions, and the proportion who call themselves moderate has not changed in forty years.51 Unfortunately, it’s the extremists who are more likely to vote, donate, and pressure their representatives. There is little reason to think that any of this has improved since the survey was conducted in 2014, to put it mildly.

  Universities ought to be the arena in which political prejudice is set aside and open-minded investigation reveals the way the world works. But just when we need this disinterested forum the most, academia has become more politicized as well—not more polarized, but more left-wing. Colleges have always been more liberal than the American population, but the skew has been increasing. In 1990, 42 percent of faculty were far left or liberal (11 percentage points more than the American population), 40 percent were moderate, and 18 percent were far right or conservative, for a left-to-right ratio of 2.3 to 1. In 2014 the proportions were 60 percent far left or liberal (30 percentage points more than the population), 28 percent moderate, and 12 percent conservative, a ratio of 5 to 1.52 The proportions vary by field: departments of business, computer science, engineering, and health science are evenly split, while the humanities and social sciences are decidedly on the left: the proportion of conservatives is in the single digits, and they are outnumbered by Marxists two to one.53 Professors in the physical and biological sciences are in between, with few radicals and virtually no Marxists, but liberals outnumber conservatives by a wide margin.

  The liberal tilt of academia (and of journalism, commentary, and intellectual life) is in some ways natural.54 Intellectual inquiry is bound to challenge the status quo, which is never perfect. And verbally articulated propositions, intellectuals’ stock in trade, are more congenial to the deliberate policies typically favored by liberals than to the diffuse forms of social organization such as markets and traditional norms typically favored by conservatives.55 A liberal tilt is also, in moderation, desirable. Intellectual liberalism was at the forefront of many forms of progress that almost everyone has come to accept, such as democracy, social insurance, religious tolerance, the abolition of slavery and judicial torture, the decline of war, and the expansion of human and civil rights.56 In many ways we are (almost) all liberals now.57

  But we have seen that when a creed becomes attached to an in-group, the critical faculties of its members can be disabled, and there are reasons to think that has happened within swaths of academia.58 In The Blank Slate (updated in 2016) I showed how leftist politics had distorted the study of human nature, including sex, violence, gender, childrearing, personality, and intelligence. In a recent manifesto, Tetlock, together with the psychologists José Duarte, Jarret Crawford, Charlotta Stern, Jonathan Haidt, and Lee Jussim, documented the leftward swing of social psychology and showed how it has compromised the quality of research.59 Quoting John Stuart Mill—“He who knows only his own side of the case, knows little of that”—they called for greater political diversity in psychology, the version of diversity that matters the most (as opposed to the version commonly pursued, namely people who look different but think alike).60

  To the credit of academic psychology, Duarte et al.’s critique has been respectfully received.61 But the respect is far from universal. When the New York Times columnist Nicholas Kristof cited their article favorably and made similar points, the angry reaction confirmed their worst accusations (the most highly recommended comment was “You don’t diversify with idiots”).62 And a faction of academic culture composed of hard-left faculty, student activists, and an autonomous diversity bureaucracy (pejoratively called social justice warriors) has become aggressively illiberal. Anyone who disagrees with the assumption that racism is the cause of all problems is called a racist.63 Non-leftist speakers are frequently disinvited after protests or drowned out by jeering mobs.64 A student may be publicly shamed by her dean for a private email that considers both sides of a controversy.65 Professors are pressured to avoid lecturing on upsetting topics, and have been subjected to Stalinesque investigations for politically incorrect opinions.66 Often the repression veers into unintended comedy.67 A guideline for deans on how to identify “microaggressions” lists remarks such as “America is the land of opportunity” and “I believe the most qualified person should get the job.” Students mob and curse a professor who invited them to discuss a letter written by his wife suggesting that students chill out about Halloween costumes. A yoga course was canceled because yoga was deemed “cultural appropriation.” The comedians themselves are not amused: Jerry Seinfeld, Chris Rock, and Bill Maher, among others, are wary of performing at college campuses because inevitably some students will be enraged by a joke.68

  For all the follies on campus, we can’t let right-wing polemicists indulge in a bias bias and dismiss any idea they don’t like that comes out of a university. The academic archipelago embraces a vast sea of opinions, and it is committed to norms such as peer review, tenure, open debate, and the demand for citation and empirical evidence that are engineered to foster disinterested truth-seeking, however imperfectly they do so in practice. Colleges and universities have fostered the heterodox criticisms reviewed here and elsewhere, while delivering immense gifts of knowledge to the world.69 And it’s not as if alternative arenas—the blogosphere, the Twittersphere, cable news, talk radio, Congress—are paragons of objectivity and rigor.

  Of the two forms of politicization that are subverting reason today, the political is far more dangerous than the academic, for an obvious reason. It’s often quipped (no one knows who said it first) that academic debates are vicious because the stakes are so small.70 But in political debates the stakes are unlimited, including the future of the planet. Politicians, unlike professors, pull the levers of power. In 21st-century America, the control of Congress by a Republican Party that became synonymous with the extreme right has been pernicious, because it is so convinced of the righteousness of its cause and the evil of its rivals that it has undermined the institutions of democracy to get what it wants. The corruptions include gerrymandering, imposing voting restrictions designed to disenfranchise Democratic voters, encouraging unregulated donations from moneyed interests, blocking Supreme Court nominations until their party controls the presidency, shutting down the government when their maximal demands aren’t met, and unconditionally supporting Donald Trump over their own objections to his flagrantly antidemocratic impulses.71 Whatever differences in policy or philosophy divide the parties, the mechanisms of democratic deliberation should be sacrosanct. Their erosion, disproportionately by the right, has led many people, including a growing share of young Americans, to see democratic government as inherently dysfunctional and to become cynical about democracy itself.72

  Intellectual and political polarization feed each other. It’s harder to be a conservative intellectual when American conservative politics has become steadily more know-nothing, from Ronald Reagan to Dan Quayle to George W. Bush to Sarah Palin to Donald Trump.73 On the other side, the capture of the left by identity politicians, political correctness police, and social justice warriors creates an opening for loudmouths who brag of “telling it like it is.” A challenge of our era is how to foster an intellectual and political culture that is driven by reason rather than tribalism and mutual reaction.

  * * *

  Making reason the currency of our discourse begins with clarity about the centrality of reason itself.74 As I mentioned, many commentators are confused about it. The discovery of cognitive and emotional biases does not mean that “humans are irrational” and so there’s no point in trying to make our deliberations more rational. If humans were incapable of rationality, we could never have discovered the ways in which they were irrational, because we would have no benchmark of rationality against which to assess human judgment, and no way to carry out the assessment. Humans may be vulnerable to bias and error, but clearly not all of us all the time, or no one would ever be entitled to say that humans are vulnerable to bias and error. The human brain is capable of reason, given the right circumstances; the problem is to identify those circumstances and put them more firmly in place.

  For the same reason, editorialists should retire the new cliché that we are in a “post-truth era” unless they can keep up a tone of scathing irony. The term is corrosive, because it implies that we should resign ourselves to propaganda and lies and just fight back with more of our own. We are not in a post-truth era. Mendacity, truth-shading, conspiracy theories, extraordinary popular delusions, and the madness of crowds are as old as our species, but so is the conviction that some ideas are right and others are wrong.75 The same decade that has seen the rise of pants-on-fire Trump and his reality-challenged followers has also seen the rise of a new ethic of fact-checking. Angie Holan, the editor of PolitiFact, a fact-checking project begun in 2007, noted:

  [Many of] today’s TV journalists . . . have picked up the torch of fact-checking and now grill candidates on issues of accuracy during live interviews. Most voters don’t think it’s biased to question people about whether their seemingly fact-based statements are accurate. Research published earlier this year by the American Press Institute showed that more than eight in 10 Americans have a positive view of political fact-checking.

 
