Know-It-All Society

by Michael P. Lynch


  Because they reflect our self-identities, our convictions carry authority over our lives. Most obviously, they have authority over our actions; they obligate us to do some things and grant us permission to do others. A religious conviction, for example, can give believers the moral permission to blow themselves up, or cause them to engage in nonviolent protest in support of civil rights. Even a personal conviction can play this role—by excusing us, for example, from other moral demands. If one of your convictions is to put family before work, then it will make sense for you to skip a late meeting to make it to your kid’s soccer game. Or, if you missed the last one, you might feel obligated to make the next one. We may not live up to such obligations, but we feel them just the same.

  But convictions don’t carry just moral authority. They carry authority over what we believe. Once something becomes a real conviction, it is difficult for us to doubt; it becomes part of our form of life.

  Wittgenstein argued that certain propositions are like the hinges on which our worldview turns. On the surface, they can seem rather like any other proposition; they aren’t special, mind-bending truths like Descartes’s “I think, therefore I am,” and thus they aren’t certain in the logical sense. But we find it hard to really doubt such hinges anyway, because then the world would stop making sense. Consider, to take one of Wittgenstein’s favorite examples, the proposition that I have hands. This could be false; it isn’t like the thought that 2 and 2 make 4. But doubting that I have hands is not something I can ordinarily do with much seriousness. Anything I marshal to defeat the proposition will not be more certain than it is. Thus, attempts to convince me that I don’t have hands, or that my name is not really Michael, would be met with extreme skepticism. Even if the evidence presented to me were very good (for example, evidence that what I thought was my birth certificate was a forgery), I might be inclined to think that the evidence, rather than my belief, was flawed. As Wittgenstein says, “Here a doubt would seem to drag everything with it and plunge it into chaos.”16

  Although they function in some ways like Wittgenstein’s hinge propositions, convictions aren’t generally about mundane matters that we typically take for granted or just believe without thinking about them. They are commitments. But we do hold convictions fixed, and we are willing to make all sorts of sacrifices on their behalf. We often are willing to explain away contrary evidence, even if doing so flies in the face of the facts or logic itself. And we do that precisely because of the authority we give convictions over our life by virtue of their connection to our self-identity. That’s why I am so reluctant to give them up, and why I may feel bad or guilty for not having the courage to live up to them. It is because they are commitments central to my self-identity that giving up a conviction can feel like an act of self-betrayal and a betrayal of one’s tribe. And the tribe may well agree: if you don’t share your tribe’s convictions, you aren’t a true believer, and that may mean having your membership privileges revoked.17

  These facts are why it is so hard to be open-minded about our convictions, and why a challenge to our convictions can feel almost like an assault—an attack “on our way of life.” Of course, we don’t necessarily expect that people will agree with our convictions. But we do typically expect people to respect them, and to acknowledge our right to have them. These facts, in turn, explain why we are often unwilling to “go there”—to even discuss our convictions, as opposed to those beliefs that we think are true but we don’t associate with our identity. Defending our convictions seems like defending our identity itself.

  From Belief to Conviction

  Friedrich Nietzsche, who probably thought more about convictions than any other philosopher, understood something about convictions that is easy to overlook, yet crucial for understanding how they become the unwitting engines of arrogance. “Every conviction has its history—its pre-formations, its ventures, its mistakes,” Nietzsche wrote; “it becomes a conviction only after not being one.”18 What starts off as a simple belief, in other words, can, given the right circumstances, take on all the pomp of a deep, identity-reflecting value. And when that happens, the authority that the conviction brings with it—just by virtue of being a conviction—means we may shield ourselves from evidence that may seem to undermine it.

  Nietzsche’s point that convictions have histories as nonconvictions helps to explain a particularly weird feature of much of our public discourse right now—namely, that almost everything can suddenly turn political. What you eat, the kind of car you drive, and even where you buy your coffee are now commonly seen as politically motivated choices. Thanks to the politicization of climate change, even the weather is not off-limits. During the late summer of 2017, under threat from an impending hurricane, the governor of North Carolina ordered the evacuation of coastal areas in the storm’s path. Certain conservative talk show hosts declared such hurricane hysteria to be part of a plot to bolster the climate change hoax. Irrational? Absolutely. But such irrationality is now sadly familiar. And much like claims that school-shooting survivors are actors, such wild accusations don’t just come out of nowhere. In part, they reflect the growing sense that issues that were once politics-free or off-limits are now seen through partisan lenses. And that shift in perception often happens precisely because what we might call straightforward “matters of fact”—like whether a hurricane is threatening the coast, or whether Dunkin’ Donuts has “conservative” coffee—have been turned into matters of conviction. We’ve made them personal.

  A “matter of fact” is a question that, at least in principle, can be decided by empirical means. Public debates often hinge on exactly such matters: whether a bridge project will cost this much or that much, whether more regulation will deter pollution, whether increased patrols in a neighborhood will lower the crime rate. These are the bread-and-butter issues of political policy, and they are not overtly philosophical matters. They are empirical ones, and we typically hope to bring to bear the tools of science to decide them. In the ideal world, we run the studies and then rely on the best view of the facts we have at the time to decide which way to go. But this ideal is often not realized. The process gets politicized, in at least one of two ways. One obvious means of politicizing an issue is to disagree about cost, and to engage in the usual and somewhat tiresome debate over government spending. Even in the face of such politicization, however, there is often widespread agreement about the facts, or at least about how they are to be investigated. The issue is really one about how best to construct policy in light of the facts.

  But there is another way debates over matters of fact get politicized: they turn into matters of conviction. Consider the debate over climate change. On the face of it, questions about the causes of climate change are much like questions about the causes of damage to a bridge. But Americans do not approach the matter in the same way, as is indicated by research indicating that a person’s political affiliations are highly predictive of an inclination to believe that climate change is a significant threat.19 Conservatives are apt to think not; liberals are apt to think yes. Moreover, our political identities determine the extent to which we reject evidence that contradicts our views on climate change, or whether we are willing to accept even bad evidence that supports our worldview.

  And that phenomenon is hardly confined to climate change. Would banning assault rifles reduce gun violence? Are vaccines safe? On all these issues, Americans are demonstrably prone to react to any data on the subject in a way that reflects their political viewpoint.

  There are various ways to think about this problem. One might be that religiously influenced political viewpoints discourage scientific literacy. As a result, people reject climate change because they don’t understand science in general. Solution: teach more science. Another is that some political approaches encourage close-mindedness and poor critical thinking skills more than others do. Solution: teach people logic.

  Both of these suggestions have merit. There is good reason to think that people can, over time, become both more scientifically literate and better critical thinkers, at least in certain respects. Appreciating science is often more a matter of coming to see that it is important to understand how something works than of grasping an abstract theory. And we are more prone to think that it matters how something works when we can understand the stakes—such as the stakes of climate change. Moreover, people can, again over time, change their minds on matters of conviction—and reason can play a part. A good example on this score is people’s attitudes on gay marriage. The shift in attitudes over the last decade is due to multiple factors. But one factor is the dramatic way in which court challenges to gay marriage have continually flopped—and flopped because of the inability of those challenging gay marriage to cite any evidence that it is harmful. The failure of the opponents of gay marriage here was, in this case, a rational and epistemic failure, and it arguably played a part in the larger cultural shift over time.20

  Teaching critical thinking and science, however, is playing the long game. And on that score it is probably the most important approach. But in the short term these strategies often run smack into the fact that humans are prone to turn matters of fact into matters of conviction. And when that happens, we engage in identity-protective reasoning. That’s because our convictions are part of our self-identity, our conception of our deepest self. And for that reason, the authority of conviction makes it very unlikely that teaching people more logic and science will help change their minds in the short run.

  Recent research by Yale psychologist Dan Kahan and his colleagues makes the point vivid.21 Kahan’s work suggests that, contrary to common sense, the more scientifically literate and cognitively sharp you happen to be, the more polarized you often are—no matter what your political affiliation. Logically minded, scientifically literate conservatives are even more prone to dig in and defend their rejection of scientific studies supporting human involvement in climate change in the face of countervailing evidence. And the same is true of liberals opposed to vaccinations. The more you know, the more resistant you are to changing your mind. That stubbornness may be due in part to the fact that people who are more scientifically literate are more skilled at poking holes in studies they don’t agree with, and at reading those they do agree with in the best possible light. But Kahan and others argue that this finding simply reflects a more general lesson: that we are prone to accept information when it affirms some aspect of our self-identity and to reject it when we perceive it as threatening that identity. And in one sense, this approach is practically rational.

  Our understanding of the nature of conviction shows us that the point is entirely general. When we allow some matter of fact to become a matter of conviction—such as the human contributions to climate change, or the impact of gun control legislation on suicide rates, or the safety of vaccines—our commitments on these matters take on certain kinds of authority over our life. That’s part of what makes them convictions. For that reason, it can become practically rational to ignore evidence that might undermine them. Convictions make it practically rational to be epistemically irrational.

  All this still leaves open the question of what causes us to turn matters of fact into matters of conviction in the first place. What makes us commit so deeply to a view about climate change or tax policy or veganism that we give it authority over our actions and beliefs?

  One explanation, sometimes floated by Kahan, is peer pressure. Perhaps we turn ordinary beliefs into convictions when many people in our cohort start treating them that way. They become tribal badges of honor.22 If you live in a community with mostly climate-change deniers—and more important, people for whom climate-change denial is a matter of conviction—it will be highly uncomfortable for you to be open to the reality of climate change. There will be social risks if you fight the current, and social rewards if you also deny climate change. So, there are powerful—if unconsciously operative—practical reasons for you to adopt climate-change skepticism as well.23

  This explanation is a good start, but on its own it is neither necessary nor sufficient. It is not sufficient because peer pressure, when it is effective, also works at the level of mere belief in simple matters of fact. If my friends believe that the restaurant we’re going to is open, I may on that basis adopt that belief as well, even if at first I doubt it. The same principle works for desires. The best way to sell something—to make people want to buy it—is to convince them that their peers want it too. So, while peer pressure and tribal cohesion explain why it is practically rational to want and believe what our peers want and believe, they don’t explain how we turn a mere factual belief into a commitment that reflects our identity. And the fact that our cohort has a particular belief isn’t necessary for a belief to become a conviction either. Otherwise we would never find examples of people who stood their ground against their peers. But of course we do.

  To understand why beliefs become convictions, we need to remember that they are commitments to what matters to us; they concern our values. Some of them are explicitly like that; they are explicitly about what is right and wrong. But that’s not the only way we can acquire a commitment that we own as part of our identity. Straightforward matters of fact can become morally entangled. Matters of fact are the sorts of things we could assess, in some cases with scientific investigation, and in other cases with historical or legal or economic investigations. But as we’ve seen, certain matters of fact can be treated by people on both sides of the political spectrum as something that must be true (or false). When someone insists, for example, that climate change must be a hoax, or that Trump must have personally conspired with Putin to get elected, the “must” is a signal that we have let a straightforward matter of fact become morally entangled.

  Moral entanglement happens when one becomes committed to a belief in a matter of fact because its truth—rightly or wrongly—is regarded as evidentially related to a moral commitment, in the following sense: its falsity would undermine the perceived evidence for that moral commitment. When that happens, a seemingly straightforward claim about physical events has become shot through with our moral values. Thus, the empirical belief takes on moral salience from the explicit moral values around it, and any attack on it is treated as an attack on those values.

  This is hardly a new phenomenon. Consider, by way of example, the nineteenth-century “scientific” belief in the differences in skulls between different races, which were thought to justify the belief that people of African descent were not only less intelligent than those of European descent, but literally of a different species, and thus could be treated unjustly or enslaved on that basis. Major scientific figures, such as the famed zoologist Louis Agassiz, were supporters of this view, and lectures given throughout the South aimed to reinforce not a scientific viewpoint but a moral conviction—one that reinforced the image of the white southern male as genetically and morally superior. As a result, views about human anatomy became convictions.24

  Or, to take an entirely different example, some Americans on the Left in the mid-twentieth century believed that the Soviet Union was misunderstood and more benign than their own government maintained. Some abandoned this belief in the face of news of Stalin’s purges, gulags, and the brutal suppression of dissent in Czechoslovakia and elsewhere. But others rationalized away accounts of Soviet oppression. Their beliefs about historical events had become morally entangled in their overall views about justice and therefore became convictions themselves, super resistant to counterevidence.

  We can see similar forces at work with regard to beliefs in the safety of vaccines, in the efficacy (or not) of free trade, and in the reality or unreality of climate change. The latter case in particular suggests that moral entanglement can work even when the explicit moral convictions are twice removed. Consider the claim made by some that climate change is a hoax perpetuated by the billionaire George Soros. If someone who believed this claim came to think it false, the shift in thinking probably wouldn’t directly undermine any of that person’s moral convictions. But it would have an indirect effect. For example, accepting that climate change is real would partially undermine the related belief that the scientific establishment is a propaganda arm of a liberal conspiracy. And that fact, in turn, would directly undermine the explicitly moral commitment that the scientific establishment is morally corrupt and thus not to be trusted.

  Many, perhaps most, of our ordinary beliefs about the world are connected, if only distantly, with our moral principles (mathematical beliefs may be an exception). But that fact does not amount to moral entanglement. What I am calling moral entanglement is the process of coming to commit to a belief because it is perceived to have an evidential relationship to a strongly held moral conviction. That conception is consistent with the obvious fact that the process often goes the other way. Our commitments to matters of fact can cause us to develop certain moral convictions, as when a realization that opioid abuse is widespread makes someone come to view sentencing laws as unjust. That, too, could then count as moral entanglement, although it need not. Which things are morally entangled depends on the person.

  Moral entanglement works with social pressure to act as a mechanism for instilling conviction. Many of our beliefs become convictions because they have already been woven into the larger narrative of a tribe we aspire to remain a part of. They become morally entangled because those beliefs reflect who we think we want to be; and our emotional commitment remains, no matter what challenges come down the pike. Indeed, we take such challenges personally—literally, because we’ve expanded our self-identity, our self-narrative, to include the belief. That is how an ordinary belief in a matter of fact becomes what we could call a “blind conviction.” Blind convictions, like blind faith, are convictions formed not on the basis of evidence but on the basis of attitudes. We allow our self-identity, formed by our acceptance of wider cultural narratives and the attitudes they embody, to stretch and enlarge; it covers more ground and includes more as essential to it than it did before. Thanks to our unreflective desire to fit into wider narratives—to adopt them as our own—we take some matters of fact to be part of our identity, come what may.
