Saving Us
“I was on your Twitter,” said my science teacher dad, who has the good sense not to have a Twitter account. “Why do you bother responding to those idiots?”
I told him it’s important to counter disinformation. When these arguments occur publicly, there are others listening in who need to know that, as scientists, we have heard these zombie objections many times and we have good responses to them.
But in another sense, my dad was right. The positive feedback I’ve received for trying to respond with accurate data is sparse, though valued. “You dragged my sorry denier’s ass to the truth,” confessed one fellow Christian in a memorable tweet, and “You changed my dad’s mind—thank you!” said another, privately. But such highlights are usually overwhelmed by the negative attacks from those who are unable to look at, let alone consider, something they see as a challenge to their identity.
“There’s no convincing proof climate change is man-made,” said one man. I replied with a thread I’d created earlier that systematically goes through each natural cause of climate change and shows why it can’t be responsible for the current warming. “Insults are not proof,” he responded, “keep trying, ‘Professor’ ”—making it clear that he perceives facts to be attacks and expertise to be legitimate only if he agrees with what you say.
Of all the confrontational people I’ve responded to with carefully marshaled and fully cited science, only a tiny handful have ever taken the time to engage in a thoughtful and honest way. So why aren’t all these facts working to change people’s minds? And if the facts are more accessible today than ever, why are so many people getting them so wrong?
PICKING YOUR OWN FACTS
Often, like my dad said, we think it’s because people who disagree with us are idiots. There’s even a book about it: I’m Right and You’re an Idiot, by Canadian public relations expert Jim Hoggan. But as his subtitle, The Toxic State of Public Discourse and How to Clean It Up, suggests, that first assumption we jump to is a big part of the problem.
The vast majority of us understand that science and facts explain the way the world works, and that we ignore them at our peril. We all know that if someone says, “Gravity isn’t real,” and steps off a cliff, they’re going down whether they “believe” in it or not. So it’s reasonable that individuals or institutions that want to change the way people think apply the knowledge deficit model: the idea that, if people disagree with some fact or scientific explanation, it’s because they don’t know enough. If that’s true, then by implication, more information—better education, clearer explanations—will prevent people from making misleading claims about climate change.
This approach can work if we’re talking about issues that don’t have any moral or political baggage attached to them, like black holes or insect behavior. It can also work if we’re talking about an issue that doesn’t require immediate action, like astronomers’ warning of a comet that might approach the Earth too close for comfort a century from now. But when politics, ideology, identity, and morality get tangled up in science—when our frames, as George Lakoff calls them, get in the way—then all bets are off. And what if that science implies that urgent and widespread action is needed? That’s when the gloves come off, too.
Social scientist Dan Kahan has developed a measure he called “ordinary science intelligence.” It measures how capable people are of understanding data, statistics, probabilities, and scientific results. He then asked people whether they agreed that climate was changing due to human activities. He found only a weak correlation between science intelligence and the chance of a positive response. People with the lowest science intelligence had about a 35 percent chance of answering yes. People with the highest science intelligence had about a 60 percent chance of answering yes. But when he divided people by political affiliation, he found something quite different. Scientifically savvy people who identified as liberal Democrats had a greater than 90 percent chance of answering the question positively. If they identified as conservative Republicans, they had a greater than 90 percent chance of answering it negatively.
Disturbingly, Kahan and colleagues found that “people with the highest degree of scientific literacy [that is, who were best able to understand science] were not most concerned about climate change. Rather, they were the ones among whom cultural polarization was greatest.” In other words, the more broadly scientifically literate you are, the more, not less, likely it is that your political identity dictates your opinions on polarized issues like climate change.
Although Kahan’s work was done in the U.S., a more recent study of people in sixty-four different countries found a similar trend. In other developed countries, education tends to make people more concerned about climate change, but this effect was noticeably diminished if they also identified as politically conservative. In developing countries and emerging economies, on the other hand, education made everyone more concerned about climate change regardless of their political affiliation.
It turns out that being better able to handle quantitative information and understand science in general doesn’t make you more accepting of thorny, politically polarized scientific topics with moral implications that require a response; it just makes you better able to cherry-pick the information you need to validate what you already believe. For most of us, when it comes to climate change, we already have opinions. And the smarter we are, the harder we’ll try to out-argue the devil if he disagrees with us. The term for this is motivated reasoning: an emotionally driven process of selecting and processing information with the goal of confirming what you already believe rather than informing your opinions or perspective. Which may raise the question: why did you pick up this book?
MOTIVATED REASONING
Basing our opinions and judgments on reason rather than emotion is the lofty goal laid out by Greek philosophers. It continues to be pursued by scientists today. But Plato might be disappointed to learn that modern psychology strongly suggests that when it comes to making up our minds about something, emotions usually come first and reason second. If we’ve already formed our opinions, more information will get filtered through those pre-existing frames. And the more closely that frame is tied to our sense of what makes us a good person, the more tightly we’ll cling to it and let potentially opposing facts pass us by. As Jonathan Haidt explains in The Righteous Mind, “People make moral judgements quickly and emotionally.… We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgement: we reason to find the best possible reasons why somebody else ought to join us in our judgement.”
You might be thinking this primarily applies to people who don’t care about climate change, but it actually cuts both ways. If we’re worried about climate change, when we hear or see or read more bad news, it confirms what we already believe. The news essentially tells us, “See? You’re right! This is real, and it’s bad! It isn’t just you saying this. It’s these NASA scientists, and by extension, the Greenland ice sheet itself.” This confirms what we already thought and reinforces our conviction that we are good people for thinking that. By extension, it also affirms that those who disagree with us are wrong.
When we want to believe something, psychologist Thomas Gilovich says, we ask ourselves “Can I believe it?” and we search for supporting evidence. When we don’t want to believe something, we ask “Must I believe it?” and we search for contrary evidence. We all engage in this type of motivated reasoning when our identity is on the line, even when the stakes are relatively low. The smarter we are, the better we are—or so the social science warns us—at using motivated reasoning to defend our opinions and preserve our self-worth and identity.
For example, I grew up with the idea that you do not waste food. So if there isn’t mold on it, I eat it (and if it’s cheese, I just scrape the mold off and still eat it). My husband grew up with the belief that old food is bad food, and it will make you sick. And between two academics, it turns out you can find peer-reviewed articles supporting both eating food past its expiration date and throwing it out. But there is a lot of motivated reasoning involved in our discussions, because “Waste not, want not” is part of my identity. For his part, whenever I feel sick, the first thing he asks suspiciously is “Did you eat those leftovers in the fridge from last week?”
Sometimes, though, all of us engage in motivated reasoning with higher stakes. In bigger decisions regarding parenting or religion, for example, our strong emotional attachment to a given position causes us to hold fast to our pre-existing opinion in the face of significant opposition and even solid facts. We will use all the intelligence we have to show why we’re right, rather than admit we’re wrong.
As a scientist, I fight this tendency in myself all the time. When you’ve invested years of work into something, it can be scary to imagine it might be wrong or off base. That’s why I decided to join an international team a few years ago to re-analyze a handful of recent scientific studies that concluded either that the planet was not warming or that humans were not responsible. What if one of them had a point?
Our team leader, the Norwegian climate scientist Rasmus Benestad, collected the studies. There were thirty-eight in all, compared to the thousands that have been published over the same decade showing that the planet is warming and humans are responsible. He dismantled and recalculated every single one of those studies from scratch. The rest of us followed behind, checking his work. And what we concluded was astounding. In every single analysis, we found evidence of motivated reasoning: not ours, but the authors’. Some neglected important factors; others made assumptions that were incorrect; some had basic arithmetic or scientific errors that should have been detected when the results turned out to be so contradictory, but weren’t. Rather than exposing our motivated reasoning as scientists, it turned out that those rejecting the science were the ones willing to overlook information in pursuit of their goal.
The authors of those studies hadn’t been up front about their motives, but sometimes people can be remarkably honest about this. At a workshop on how climate change affects agriculture in Texas, one farmer came up to me afterward, shaking his head. “Everything you said makes sense, and I’d like to agree with you,” he confessed. “But if I agree with you, I have to agree with Al Gore, and I could never do that.”
As Peter Boghossian and James Lindsay explain in How to Have Impossible Conversations, “think of every conversation as being three conversations at once: about facts, feelings, and identity.” I thought I was having a conversation about farming and water; but we were also talking about how we felt about climate change, and about how we saw ourselves in relation to it. “It might appear that the conversation is about facts and ideas,” these authors continue, “but you’re inevitably having a discussion about morality, and that, in turn, is inevitably a discussion about what it means to be a good or bad person.” The farmer had listened to what I’d said and given it a fair shot, and he even agreed with it—logically. But he realized that he’d have to give up his moral judgment to accept this new information. It just wasn’t worth it.
HOW FACTS CAN BACKFIRE
In the most extreme cases, when people have already constructed their sense of identity around rejecting so-called liberal solutions to climate change, you can see how bringing up scientific facts can come across as a personal attack on their identity—or an “insult,” as my Twitter antagonist termed it. If rejecting climate change is part of what we believe makes us a good person, then we don’t interpret arguments to the contrary as “you’re wrong.” Rather, we hear them saying “you’re a bad person.” And no one likes to hear that. It tends to make us double down on our denial in a kind of backfire effect.
This backfire effect happened in real life, and on cable television, to my friend Anna Jane Joyner. She’s a Christian, like me, and a climate activist. Anna Jane’s dad, Rick, is a pastor of a conservative megachurch in the southern U.S. He rejects the reality of climate change—and the need for climate action—based on a suite of politically conservative views consistent with those in his tribe: other U.S. white “evangelical” leaders, Republican politicians, right-wing news media pundits, and more.
In 2014 the writers of the Emmy award–winning climate change documentary series Years of Living Dangerously figured that the dynamic between Anna Jane and her dad would make for great TV. They brought in actor Ian Somerhalder to interview them about how they’d argued about this over the years—including a six-month period when they weren’t speaking to each other. The writers also brought in a climate scientist who’s a Christian (me) and a former Republican congressman, Bob Inglis. Bob used to reject climate change himself, but he’d been convinced by his own son that climate change was real and dangerous. Since then, he’d gone on to found republicEn, an organization that advocates for free market solutions to climate change.
Bob and I presented our best arguments. We responded to all of Rick’s “But what about…?” and “Gotcha!” questions. We even visited oyster fishermen in Apalachicola Bay, near the extended Joyner family’s coastal home, to see firsthand the impacts of a warming ocean on people with no particular political axe to grind. The fishermen were Republicans themselves, just trying to get by. They were worried about how oyster catches were dropping as the oceans warmed, sea level rose, and freshwater inflows declined.
Anna Jane’s dad is a smart man. In addition to being the head of a large and successful organization, he is a pilot who understands weather nearly as well as a local meteorologist. And he’s also a Dismissive. So what do you think happened as we spoke? Thanks to the social science, you might be able to guess. All of this meant he was better at motivated reasoning than the average person, and more, not less, likely to be polarized by additional information. And that’s exactly what happened.
The more we spoke, the more his rejection hardened. You could see it happen in real time over the course of the episode. He probably felt ganged up on, and I could understand why. He definitely felt that his identity, not his opinions, was being challenged and judged. Unfortunately, the result was to drive Anna Jane’s dad even further away, and today his denial is stronger than ever. The same zombie arguments Bob and I responded to back then continue to be hauled out and re-aired at family gatherings, in group text conversations and phone calls.
And it’s not entirely his fault, either. It’s the way our brains work.
COGNITIVE MISERLINESS AND INFORMATION OVERLOAD
There’s so much information available in the world today that there is simply no way our brains can contain everything we need to know. There’s a term for this: nearly all of us are cognitive misers. In other words, we look for solutions that take the least thought. And to do that, we often rely on what others think.
I don’t know about you, but I don’t have enough time and energy to develop and maintain a deep expertise in the nuances of immigration policies, CRISPR gene editing, the latest testimony delivered in front of the U.S. House Judiciary Committee, and what the Canadian prime minister said in his throne speech. I do have a wide array of opinions on such matters, however. I’ve developed these by listening to my friends and family and colleagues, journalists and podcasters, people and sources who I trust have spent the time to learn about these issues. And, just like the rest of us, where these trusted figures stand on such issues is often directly linked to their political leanings.
So when it comes to highly polarized topics, as cognitive misers we lean toward accepting the opinions of people we trust, people who share our values. We have an incentive to adopt our tribe’s beliefs and opinions, as we are rewarded for doing so. Our reward is both social, through a sense of acceptance and community, and psychological, in that we don’t have to research this topic ourselves. Agreeing with others gives us a reassuring sense of certainty, security, and belonging in a world that increasingly seems to be too big and moving too fast. For most of us, the value of belonging far outweighs the value of attaining new information, especially if publicly accepting that information and speaking up might lead to a negative outcome—an argument, the cold shoulder, or even ostracism from your social group. And when we’re exposed to information we disagree with—as in one study where researchers looked at how people’s attitudes hardened when Democrats on Twitter read a set of conservative tweets and Republicans a set of liberal ones—we tend to double down on our previous beliefs rather than re-examine them.
Neuroscientist Tali Sharot explains in her book The Influential Mind that our brains are programmed to “get a kick out of information.” But, she goes on, “the tsunami of information we are receiving today can make us even less sensitive to data because we’ve become accustomed to finding support for absolutely anything we want to believe, with a simple click of the mouse.” If we give people new information that contradicts their frame, what they believe, and what their tribe adheres to, their brains just turn off. Even worse, she says, “because we are often exposed to contradicting information and opinions, this tendency will generate polarization, which will expand with time as people receive more and more information.”
As consistent as this is with my own experience, I was still utterly horrified as I absorbed Sharot’s no-nonsense explanation of how our brains work. Why? Because it suggests that the facts I shared with Anna Jane’s dad, and that I share daily on social media, all the information you may share with your family or people you know—even all the climate studies scientists keep publishing with more and more such facts in them—all of these may actually be contributing to the polarization of beliefs about climate change rather than helping to dispel it. Yikes!