The Republican Brain
Motivated reasoning thus helps to explain all manner of maddening, logically suspect maneuvers that people make when they’re in the middle of arguments so as to avoid changing their minds.
Consider one classic: goalpost shifting. This occurs when someone has made a clear and factually refutable claim, and staked a great deal on it—but once the claim meets its demise, the person demands some additional piece of evidence, or tweaks his or her views in some way so as to avoid having to give them up. That’s what the Seekers did when their prophecy failed; that’s what vaccine deniers do with each subsequent scientific discrediting of the idea that vaccines cause autism; that’s what the hardcore Birthers did when President Obama released his long-form birth certificate; that’s what the errant prophet Harold Camping did when his predicted rapture did not commence on May 21, 2011, and the world did not end on October 21, 2011.
In all of these cases, the individuals or groups involved had staked it all on a particular piece of information coming to light, or a particular event occurring. But when the evidence arrived and it contradicted their theories, they didn’t change their minds. They physically and emotionally couldn’t. Rather, they moved the goalposts.
Note, however, that only those who do not hold the irrational views in question see this behavior as suspect and illogical. The goalpost shifters probably don’t perceive what they are doing, or understand why it appears (to the rest of us) to be dishonest. This is also why we tend to perceive hypocrisy in others, not in ourselves.
Indeed, a very important motivated reasoning study documented precisely this: Democrats viewed a Republican presidential candidate as a flip-flopper or hypocrite when he changed positions, and vice versa. Yet each side was more willing to credit that its own party’s candidate had had an honest change in views.
The study in question was conducted by psychologist Drew Westen of Emory University (also the author of the much noted book The Political Brain) and his colleagues, and it’s path-breaking for at least two reasons. First, Westen studied the minds of strong political partisans when they were confronted with information that directly challenged their views during a contested election—Bush v. Kerry, 2004—a time when they were most likely to be highly emotional and biased. Second, Westen’s team used functional magnetic resonance imaging (fMRI) to scan the brains of these strong partisans, discovering which parts were active during motivated reasoning.
In Westen’s study, strong Democrats and strong Republicans were presented with “contradictions”: Cases in which a person was described as having said one thing, and then done the opposite. In some cases these were politically neutral contradictions—e.g., about Walter Cronkite—but in some cases they were alleged contradictions by the 2004 presidential candidates. Here are some examples, which are fairly close to reality but were actually constructed for the study:
George W. Bush: “First of all, Ken Lay is a supporter of mine. I love the man. I got to know Ken Lay years ago, and he has given generously to my campaign. When I’m President, I plan to run the government like a CEO runs a country. Ken Lay and Enron are a model of how I’ll do that.”
Contradictory: Mr. Bush now avoids any mention of Ken Lay and is critical of Enron when asked.
John Kerry: During the 1996 campaign, Kerry told a Boston Globe reporter that the Social Security system should be overhauled. He said Congress should consider raising the retirement age and means testing benefits. “I know it’s going to be unpopular,” he said. “But we have a generational responsibility to fix this problem.”
Contradictory: This year, on Meet the Press, Kerry pledged that he will never tax or cut benefits to seniors or raise the age for eligibility for Social Security.
Encountering these contradictions, the subjects were then asked to consider whether the “statements and actions are inconsistent with each other,” and to rate how much inconsistency (or, we might say, hypocrisy) they felt they’d seen. The result was predictable, but powerful: Republicans tended to see hypocrisy in Kerry (but not Bush), and Democrats tended to see the opposite. Both groups, though, were much more in agreement about whether they’d seen hypocrisy in politically neutral figures.
This study also provides our first tantalizing piece of evidence that Republicans may be more biased, overall, in defense of their political beliefs or their party. While members of both groups in the study saw more hypocrisy or contradiction in the candidate they opposed, Democrats were more likely to see hypocrisy in their own candidate, Kerry, as well. But Republicans were less likely to see it in Bush. Thus, the authors concluded that Republicans showed “a small but significant tendency to reason to more biased conclusions regarding Bush than Democrats did toward Kerry.”
While all this was happening, the research subjects were also having their brains scanned. Sure enough, the results showed that when engaged in biased political reasoning, partisans were not using parts of the brain associated with “cold,” logical thinking. Rather, they were using a variety of regions associated with emotional processing and psychological defense. Instead of listing all the regions here—there are too many, you’d be drowning in words like “ventral”—let me instead underscore the key conclusion.
Westen captured the activation of what appeared to be emotionally oriented brain circuits when subjects were faced with a logical contradiction that activated their partisan impulses. He did not capture calm, rational deliberation. These people weren’t solving math problems. They were committing the mental equivalent of beating their chests.
Notes
26 “A man with a conviction . . .” My account of the Seekers is based on Festinger’s classic book (with Henry W. Riecken and Stanley Schachter), When Prophecy Fails, first published by the University of Minnesota Press in 1956. My edition is published by Pinter & Martin, 2008. All quotations are from this text.
28 how smokers rationalize For a highly readable overview of “cognitive dissonance” theory and the many different phenomena it explains, see Carol Tavris and Elliot Aronson, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, New York: Houghton Mifflin Harcourt, 2007. The smoking example is provided by Aronson in his foreword to When Prophecy Fails, Pinter & Martin, 2008.
29 motivated reasoning For an overview see Ziva Kunda, “The Case for Motivated Reasoning,” Psychological Bulletin, November 1990, Vol. 108, No. 3, pp. 480–498.
29 Thinking and reasoning are actually suffused with emotion See Antonio Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain, New York: Putnam, 1994, and Joseph LeDoux, The Emotional Brain, New York: Simon & Schuster, 1996.
29 about 2 percent George Lakoff, The Political Mind, New York: Penguin, 2008, p. 9.
29 classic 1979 experiment Lord, Ross & Lepper, “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence,” Journal of Personality and Social Psychology, 1979, Vol. 37, No. 11, pp. 2098–2109.
29 affirmative action and gun control Taber & Lodge, “Motivated Skepticism in the Evaluation of Political Beliefs,” American Journal of Political Science, Vol. 50, Number 3, July 2006, pp. 755–769.
30 the accuracy of gay stereotypes Munro & Ditto, “Biased Assimilation, Attitude Polarization, and Affect in Reactions to Stereotype-Relevant Scientific Information,” Personality and Social Psychology Bulletin, June 1997, Vol. 23, No. 6, pp. 636–653.
30 “confederation of systems” Jonathan D. Cohen, “The Vulcanization of the Human Brain: A Neural Perspective on Interactions Between Cognition and Emotion,” Journal of Economic Perspectives, Vol. 19, No. 4, Fall 2005, pp. 3–24.
30 closely related to those that we find in other animals See Joseph LeDoux, The Emotional Brain, New York: Simon & Schuster, 1996.
30 somewhere in Africa “Homo sapiens,” Institute on Human Origins, available online at http://www.becominghuman.org/node/homo-sapiens-0.
30 fast enough to detect with an EEG device Milton Lodge and Charles Taber, The Rationalizing Voter, unpublished manuscript shared by authors.
31 “natural selection basically didn’t trust us” Interview with Aaron Sell, August 12, 2011.
31 control system to coordinate brain operations Leda Cosmides & John Tooby, “Evolutionary Psychology and the Emotions,” Handbook of Emotions, 2nd Edition, M. Lewis & J. M. Haviland-Jones, Eds. New York: Guilford, 2000.
31 “primacy of affect” R.B. Zajonc, “Feeling and Thinking: Preferences Need No Inferences,” American Psychologist, February 1980, Vol. 35, No. 2, pp. 151–175.
31 spreading activation Milton Lodge and Charles Taber, The Rationalizing Voter, unpublished manuscript shared by authors.
32 “They retrieve thoughts that are consistent with their previous beliefs” Interview with Charles Taber and Milton Lodge, February 3, 2011.
32 we’re actually being lawyers Jonathan Haidt, “The Emotional Dog and Its Rational Tail: A Social Intuitionist Approach to Moral Judgment,” Psychological Review, 2001, Vol. 108, No. 4, pp. 814–834.
32 “confirmation bias” For an overview, see Raymond S. Nickerson, “The Confirmation Bias: A Ubiquitous Phenomenon in Many Guises,” Review of General Psychology, 1998, Vol. 2, No. 2, pp. 175–220.
32 “disconfirmation bias” Taber & Lodge, “Motivated Skepticism in the Evaluation of Political Beliefs,” American Journal of Political Science, Vol. 50, Number 3, July 2006, pp. 755–769.
33 “a person who claimed that he had won the race” Paul Bloom & Deena Skolnick Weisberg, “Childhood Origins of Adult Resistance to Science,” Science, May 18, 2007, Vol. 316, pp. 996–997.
33 either heavy metal or country Paul A. Klaczynski, “Bias in Adolescents’ Everyday Reasoning and Its Relationship With Intellectual Ability, Personal Theories, and Self-Serving Motivation,” Developmental Psychology, 1997, Vol. 33, No. 2, pp. 273–283.
35 “At least by late adolescence. . .” Paul A. Klaczynski and Gayathri Narasimham, “Development of Scientific Reasoning Biases: Cognitive Versus Ego-Protective Explanations,” Developmental Psychology, 1998, Vol. 34, No. 1, pp. 175–187.
35 our groups For the role of group affiliation in identity-protective cognition, and an overview of motivated reasoning generally and how it operates in a legal context, see Dan M. Kahan, “The Supreme Court 2010 Term—Foreword: Neutral Principles, Motivated Cognition, and Some Problems for Constitutional Law,” 125 Harvard Law Review, pp. 1–77.
36 the more powerful it becomes George Lakoff, The Political Mind: A Cognitive Scientist’s Guide to Your Brain and Its Politics, New York: Penguin, 2008.
36 “change brains” George Lakoff, The Political Mind, New York: Penguin, 2008.
40 Drew Westen Drew Westen et al., “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election,” Journal of Cognitive Neuroscience, 2006, Vol. 18, No. 11, pp. 1947–1958.
Chapter Two
Smart Idiots
I’m convinced that in most cases in which people (especially today’s political conservatives) deny inconvenient facts, resist contrary evidence, and sometimes come up with elaborate counterarguments, motivated reasoning is a key part of the process. In other words, it is all around us. Our political discourse is choking on it—even though very few of us seem to notice or admit it.
One reason for this is that while the arguments we hear may be impelled by automatic emotional reactions, that doesn’t make them any less clever-sounding or persuasive. Some can be crafty indeed. And that’s perhaps never more true than when they become technical and involve “expertise.”
In debates over scientific or technical matters with partisan implications—is global warming happening, did Iraq have weapons of mass destruction, and so on—the same game recurs. Let’s call it “My expert is better than yours.” It’s very simple: In a dispute where neither participant is actually an expert, the two debaters cite different experts, with different views, to bolster their beliefs. Both believe their expert is right and reliable, and that the other guy’s isn’t.
Motivated reasoning explains this phenomenon too. According to intriguing research by Yale Law professor Dan Kahan and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict who they consider to be a legitimate scientific expert in the first place—and where they consider “scientific consensus” to lie on contested issues. These same views also lead them to reject the expertise of experts who don’t agree with them. They simply assume they’re not really experts at all.
In Kahan’s research individuals are classified, based on their political and moral values, as either individualists or communitarians, and as either hierarchical in outlook or egalitarian. To conceptualize this, picture a simple Cartesian plane with two axes, of the sort that we all remember from algebra class. One axis runs from very hierarchical in outlook (believing that society should be highly structured and ordered, including based on gender, class, and racial differences) to very egalitarian in outlook (the opposite). The other runs from very individualistic in outlook (believing that we all are responsible for our own fates in life and people should be rewarded for their choices and punished for their faults, and that government should not step in to prevent this) to very communitarian in outlook (the opposite).
This creates four ideological quadrants, with each of us located in one of them. And though sometimes the picture grows more complicated, broadly speaking, hierarchical-individualists correspond to U.S. conservatives, whereas egalitarian-communitarians correspond to U.S. liberals. The two groups will largely be found occupying different quadrants—although in reality, individuals are scattered all over the place and may change quadrants depending on the issue at hand.
In the next section, I will say more about Kahan’s scheme—and others—that divide up the political parties based on their followers’ cultural values or moral systems. For now, though, let’s survey the consequences that divisions like these have for how we understand science and facts.
In one of Kahan’s studies, members of the different groups were asked to imagine that a close friend has come to them and said that he or she is trying to decide about the risks on three contested issues: whether global warming is caused by human beings, whether nuclear waste can be safely stored deep underground, and whether letting people carry guns deters violent crime or worsens it. The experiment continued:
The friend tells you that he or she is planning to read a book about the issue but before taking the time to do so would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.
Then study subjects were shown alleged book excerpts by fake “experts” on these issues, as well as phony pictures of the authors and fictitious resumes. All the authors were depicted as legitimate experts and members of the National Academy of Sciences. The only area where they differed was on their view of the risk in question.
The results were stark: When the fake scientist’s position stated that global warming is real and caused by humans, only 23 percent of hierarchical-individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian-communitarians accepted the same scientist’s alleged expertise. (Similar divides, although not always as sharp, were observed on the other issues.)
In other words, people were rejecting the scientific source because its conclusion was contrary to their deeply held views about the world. None of the groups were “anti-science” or “anti-expert”—not in their own minds, anyway. It’s just that science and expertise were whatever they wanted them to be—whatever made them feel that their convictions had been bolstered and strengthened.
When they deny global warming, then, conservatives think the best minds are actually on their side. They think they’re the champions of truth and reality, and they’re deeply attached to this view. That is why head-on attempts to persuade them otherwise usually fail. Indeed, factual counterarguments sometimes even trigger what has been termed a backfire effect: Those with strongly held but clearly incorrect beliefs not only fail to change their minds, but hold their wrong views more tenaciously after being shown contradictory evidence or a refutation.
To show this, let’s move from global warming to a question that, from the perspective of the political mind, is very similar: whether Saddam Hussein’s Iraq possessed hidden weapons of mass destruction prior to the U.S. invasion in 2003. When political scientists Brendan Nyhan of Dartmouth and Jason Reifler of Georgia State showed subjects fake newspaper articles in which this incorrect claim was first suggested (in a real-life 2004 quotation from President Bush) and then refuted (with a discussion of the actual findings of the 2004 Duelfer report, which found no evidence of concerted nuclear, chemical, or biological weapons efforts in pre-invasion Iraq), they found that conservatives were more likely to believe the claim than before.
The same thing happened in another experiment, when conservatives were primed with a ridiculous (and also real) statement by Bush concerning his tax cuts—“the tax relief stimulated economic vitality and growth and it has helped increase revenues to the Treasury.” The article then went on to inform study subjects that the tax cuts had not actually increased government revenue. Once again, following the factual correction, conservatives believed Bush’s false claim more strongly.
Seeking to be evenhanded, the researchers then tested how liberals responded when shown, in a similar format, that despite some Democratic claims, George W. Bush did not actually “ban” embryonic stem cell research. And it’s true: Bush merely restricted government funding to research on a limited number of stem cell lines, while leaving research completely unregulated in the private sector. Liberals weren’t particularly amenable to persuasion in the experiment either—but unlike conservatives, they did not “backfire.” Perhaps they were less defensive about the matter, less wedded to the notion of a “ban.” Perhaps whether or not it was technically a ban, they still felt Bush’s limits on stem cell research were a bad policy.