Believing

by Michael McGuire


  One evening four years later, in the comfort and protection of an apartment in Los Angeles, I opened a newly purchased book about mammals of the New World. I turned to the chapter on jaguars. It described how they attack prey from above, love water, and cover their tracks by entering a stream at one point and departing at another. As I read, I was consumed with anxiety and memories of that moment in the jungle. For at least an hour, I shook and literally couldn’t utter a coherent sentence.

  It is unlikely that other people would have responded to the presence of the jaguar as I did. Different responses to the same experience are common, as when two people witness the same robbery or attend the same rock concert. One’s emotional state, cognitive focus, assessment of context, and a multitude of often-ignored factors, such as haptic sensations (body activity), are associated with the ways information is processed.6

  Critical details of how individuals respond to and interpret their experiences, how they store them in memory, and the conditions under which they can be recalled are still to be clarified. But the jaguar incident is consistent with three points. Erroneous beliefs, such as that jaguars don’t like water, are associated with alterations in divide width. Belief may serve as a barrier to experiencing emotion. And features of experience of which one is unaware may remain unperceived for years if not forever.

  To return to daily life and direct evidence, that a sharp knife cuts tomatoes more easily and efficiently than a dull knife can be experienced and serve as direct evidence. That automobiles don’t run indefinitely and must be refueled can be experienced. Beliefs build from such evidence. Many reliably predict outcomes. People regularly fuel their cars just as they select sharp knives for cutting food. About such beliefs and evidence there is near consensus. Divides are narrow or nonexistent. Such beliefs are the source of common-sense wisdom, which we associate with judgment, decision, and action. Those who question the wisdom can test it for themselves. And in all probability, postmodernists and now a group that identifies itself as post-postmodernists all prefer to cut tomatoes with sharp knives or, if not that, their spouses, parents, children, and friends do so.

  Much of the behavior of everyday life is guided by this common-sense wisdom. Our memory serves us well in that it is easy to recall which behaviors lead to predictable outcomes and which do not. People like to solve problems, and they usually do so as efficiently as possible. Carpenters, plumbers, gardeners, cooks, housewives, arborists, doctors, lawyers, musicians, artists, and architects all have tool kits composed of beliefs and procedures about how the world works in familiar situations. Daily these are associated with actions that have a high degree of certainty that they will achieve intended outcomes. Further, for many people, there is considerable redundancy in their day-to-day lives. Thus, much of the time, life moves along relatively trouble-free. Tool kits that worked yesterday will work today. If they don’t, they are rearranged.

  However direct, a single experience usually doesn’t serve as the source of a strongly held conviction, though there are exceptions. Feeling the pain from placing one’s hand in a flame or biting into a hot chili pepper for the first time usually has both immediate and lasting influence. The housewife who washes clothes with a new soap only to find that the clothes are not clean won’t use the soap again. Nonetheless, most of the time, multiple direct-evidence experiences along with information from other sources are essential for the development of a belief. This is not without consequence: because beliefs have multiple sources, insights into why people believe some things and not others, or how divides develop and are altered, are difficult to identify.

  Direct evidence can mislead. For example, there are misperceptions such as momentarily believing that a person in a crowd is a friend. These are usually corrected by gathering further direct evidence. At other times, direct evidence is subject to multiple interpretations. This occurs when people call upon different beliefs to explain the same evidence.7 For example, persons A, B, and C observe a plant that has failed to grow. Person A believes that the failure is due to poor plant quality. Person B explains it as a consequence of inadequate soil nutrients. And person C insists that it’s due to the presence of bugs in the soil. All share the same evidence. Each offers a different explanation. Presumably the plant’s failure to grow could be studied and its cause clarified. However, in most daily-life situations, such studies occur rarely, which permits conflicting beliefs to persist unmodified.

  Direct evidence gathered while observing an event doesn’t mean that all the details of the event are perceived. This point is beautifully illustrated by the “invisible gorilla” experiment. The experiment is recounted by Christopher Chabris and Daniel Simons in their 2010 book The Invisible Gorilla.8

  The study works this way: Research subjects are shown a video of people passing basketballs back and forth. They are asked to count the number of passes. The video lasts for one minute. Partway through the video, a woman dressed in a gorilla suit appears for nine seconds, pounds her chest, and disappears. With the video over, subjects are asked several questions, one of which is, “Did you see a gorilla?” Roughly half of the subjects in the original study didn’t recall seeing a gorilla. The study, along with its many variations, has been repeated multiple times, each time with similar results.

  The explanation provided by the authors is that the subject’s failure to recognize the gorilla is due to the “illusion of attention”: “We experience far less of our visual worlds than we think we do.”9 Their interpretation is consistent with a key theme of this book: direct evidence may be incomplete or misperceived. It follows that beliefs that build from direct evidence can be wrong.

  There is also an identifiable hierarchy of interpretative preferences when dealing with evidence. Preferences are frequently observed between beliefs that people develop from direct evidence and those whose source is a third person or authority. If religious beliefs are exempted, people are far more likely to favor beliefs based on direct evidence than those of a third party.10 We prefer our own beliefs and divides to those of others. This likely explains in part why people are resistant to changing their beliefs.

  INDIRECT EVIDENCE

  If direct evidence is about personal experience, indirect evidence is about information from secondary sources. Books, newspapers, TV, radio, the Internet, and gossip are examples. Consider the evening news: “Country X has demanded an apology from Country Y. . . . The New York Yankees beat Baltimore 7 to 3. . . . The Dow Jones dropped five points in late trading. . . . The president is expected to land in London in an hour.” Each of these statements may accurately describe what has happened or is expected to happen. Nonetheless, their source remains indirect, not direct.

  Surprisingly, we often perceive indirect evidence as crisper than direct evidence. Crispness may enhance its believability and authority. Statements made on radio, on TV, in newspapers, or by next-door neighbors often are crafted such that conflicting evidence or discordant views go unmentioned; in effect, evidence that could influence divides or disconfirm what is reported is left unstated or selectively interpreted. This is a form of censorship, although it is seldom viewed this way. At times, sources of indirect evidence attain the status of “truth purveyors.” This was the case with Walter Cronkite during the 1960s and 1970s on the CBS Evening News, which took on the aura of a “context of truth.”

  Several points follow. The most obvious is that indirect evidence often ignores the “other half of the story.” Hence its crispness. A second point is that event interpretation is influenced—often significantly—by the beliefs of interpreters. This is often evident when people tell of their personal experiences. It is equally if not more strikingly evident in the areas of politics, economics, and law: the same event can be structured, interpreted, and given meaning in multiple ways. A third point is that for those whose only source of evidence is secondary, there is no foolproof way of discerning the degree to which evidence has been structured and given meaning by others. In contrast, it’s easy to select examples that are consistent with what one believes and thereby narrow divides.

  OTHER TYPES OF EVIDENCE AND EXCEPTIONS

  There are other types of evidence. Legal evidence is that which is admissible in courts of law and for which there are stringent requirements. Direct evidence is admissible and, at times, indirect evidence may be too. There is also circumstantial evidence. This consists of packages of direct and indirect evidence that tend to “prove” an event or a fact by identifying other events or circumstances that afford a basis for believing in the occurrence or nonoccurrence of an event or fact. At times, it is acceptable in courts. (It is a well-established staple of mystery novels.)

  For the discerning analyst, circumstantial evidence might be viewed as little more than sophisticated conjecture. Yet it often has a very different use in daily life when it is the basis for what people believe. For example, person X is a supervisor in County Y. It is public knowledge that, during the coming year, the county will be unable to pay its employees due to a shortage of funds from taxes and other sources. The county supervisors meet and recommend firing 25 percent of the county’s workforce. Supervisor X is absent from the meeting. The circumstantial conclusion is that supervisor X is not supporting the proposed firing or doesn’t want to be associated with it.

  There is yet another side to evidence-belief relationships: sometimes believing in advance of possessing evidence pays off. That is, believing first and finding evidence second may be a more efficient strategy than the reverse, namely, acquiring bits of evidence and trying to piece together a plausible story or belief.11

  Presumably because parents and teachers sense that this strategy invites mistakes and wastes time, it is rarely recommended. Yet in 1871, Heinrich Schliemann believed that the ancient city of Troy could be found before he set forth in search of evidence. He discovered the evidence he sought in the poems of Homer written 2,700 years earlier. They provided reliable hints about the city’s location. Homer and Schliemann were right. Troy is where Homer said it would be, a few miles north of the Dardanelles.12 A similar story applies to lasers. They began as an idea without evidence. Searches for evidence and experiments followed. Discovery was next.13 In a similar effort, a group of scientists recently launched a plan to locate forgotten and unpublished data, possibly residing in basements and drawers of scientists around the world, that they believe may have scientific import.14 While most scientists might be reluctant to admit it, many have followed the belief-first, evidence-second strategy with success.

  There is also a case to be made for not searching for evidence. For a few decisions, such as which new home to purchase, a detailed search is probably wise. However, the often-demanding requirements of daily life and the many decisions it requires may render extensive searches for direct evidence costly and unproductive. Hence the attractiveness of indirect evidence and its explanations, such as those found in cookbooks, instruction manuals, road maps, or those offered by “experts.”

  Then there are bits and pieces of suggestive but often-unconnected evidence that are associated with various but far from compelling beliefs with indeterminate divides. For example, during the coming year, those who wish to do so can experience another dozen or more new articles and books questioning Shakespeare’s authorship of many of the works traditionally attributed to him by ascribing them to Francis Bacon, Edmund Spenser, or Christopher Marlowe.15

  What implications might be drawn from the preceding discussion? One is that what constitutes direct evidence is a more complex matter than is often appreciated. We believe our experiences. No other convenient choice is available. Yet direct evidence can deceive. It can be incomplete, be misleading, and undergo alterations through time due to changes in memory.16 Further, distinguishing between what constitutes beliefs and what is evidence is not always an easy or straightforward matter. Indeed, the issue is of enough concern to the United States National Science Foundation that it has recently initiated an effort that aims to separate evidence from belief.

  Another is that, despite the absence of evidence, highly accurate predictions are often possible. Most people are not experts in how fuel burns in automobile engines, yet they have figured out ways not to run out of gas. Nor is their figuring likely to improve even if they become experts in fuel combustion. So, at times, wide divides are present in matters that may affect understanding, yet reducing them may not improve predictions associated with beliefs.

  Yet another is that people are highly dependent on, and responsive to, indirect evidence. Again, there is no convenient alternative—daily life requires decisions even when direct evidence is unavailable.

  INFERENCE AND INTUITION

  On its own, evidence doesn’t explain itself. We may sense that it does when our interpretations rapidly accompany experience. Most often, however, evidence appears in bits and pieces and lacks organization. Making sense of it requires sifting, organization, and interpretation. Inference—to deduce or reason from evidence to possible causes or outcomes—is one interpretation strategy.

  Our inferences are most convincing when they build on direct evidence and familiar explanations or models of how the world works. For example, person T doesn’t drain his outside water pipes; a subfreezing cold spell arrives, and, following that, his outside pipes burst. The inference is that the cold weather caused the water to freeze, expand, and break the pipes.

  Inferences are not free of constraints and limitations. As noted, two people may have access to the same evidence, yet they develop different inferences due to unshared beliefs about how the world works.17 As the authors of the invisible-gorilla study put it, “Our minds are built to detect meaning in patterns, to infer causal relationships from coincidences, and to believe that earlier events cause later ones.”18 This may be so, but it doesn’t mean that any two people detect meaning or infer causal relationships in the same way.

  Culture can be an influencing factor. Studies show that in explaining events, Westerners are inclined to attend to a focal object, such as why two fruit trees of the same species, size, and age bear vastly different amounts of fruit, and then reason about possible causes for the difference. In contrast, East Asians are more likely to attend to broad perceptual and conceptual fields and group objects based on family resemblance rather than category membership. Or, when North Americans try to discern how a person in a group is feeling, they concentrate primarily on the person while Japanese consider the emotions of the other people in the group.19 Similar findings are reported for people who have experienced intense religious indoctrination. Calvinists, who stress the role of the individual, show greater attentiveness to local features compared to Catholics and Jews, whose traditions stress social togetherness.20 With this range of interpretative influences and options, it is not surprising that when two people interpret the same evidence, their interpretations rarely match exactly.21

  Then there is intuition: a brain information-processing system that often provides ready insight and explanation and may be associated with near-instant action. Intuitive primacy is a term often used to refer to this system. It has been described as “human emotions” and “gut feelings” that drive judgments and action.22

  Intuition operates “outside” awareness. It can be automatic, quick, and often highly efficient, as in situations in which one engages in on-the-spot actions. It is at work when one walks along a crowded street, a task requiring the rapid integration of complex information and action to avoid contact with other pedestrians.

  Intuition also asserts itself in decision making, problem solving, and belief creation, even when a more rational approach is the better choice. For example, college students are given the following problem: “A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?” Approximately 50 percent of students say the ball costs ten cents. The correct answer is five cents: a five-cent ball plus a bat costing a dollar more, $1.05, totals $1.10. But for many students, “ten cents feels right,” they argue. Further, it is far from clear if anyone is exempt from moments of intuitive primacy or its errors. Studies suggest that people who are intuitive and in a good mood will believe almost anything.23 This point apparently applies even among those whose professions profess the importance of rational thinking; for example, 95 percent of college professors believe that the quality of their work is superior to that of colleagues working in the same field. Ninety-five percent can’t be inference. It must be intuition.

  At times, inference and intuition appear to work hand in hand to reduce divides. Psychologists interested in the causes of wrong beliefs have identified the process of illusory correlation, which leads to selectively remembering more confirming evidence compared to disconfirming evidence: in effect, it’s a kind of cognitive sleight-of-hand that produces a desired outcome. Data distortions also occur: confirming cases are created—that is, imagined—and disconfirming cases are ignored.24 Who has not reasoned this way? Further, inference and intuition often have their own agendas. This happens when there is an abundance of evidence and two people with the same belief associate their beliefs with only a selected subset of available evidence. People with strong religious and political beliefs frequently fit this description.

  Scientists studying intuition point out that, without serious conscious effort at “rational thinking,” intuitive primacy doesn’t self-correct.25 Possible reasons why are suggested by comparing the belief-creation and divide-reducing features of intuition with those of inference. They work differently. The deductive steps of inference can be recalled or evaluated another day, and their logic can be revised, if need be. Intuition too can be recalled, evaluated, and revised, but its logic remains a mystery.

 

‹ Prev