Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time


by Michael Shermer


  WHY PEOPLE BELIEVE IN GOD

  1. Arguments based on good design/natural beauty/perfection/complexity of the world or universe. (28.6%)

  2. The experience of God in everyday life/a feeling that God is in us. (20.6%)

  3. Belief in God is comforting, relieving, consoling, and gives meaning and purpose to life. (10.3%)

  4. The Bible says so. (9.8%)

  5. Just because/faith/or the need to believe in something. (8.2%)

  WHY PEOPLE THINK OTHER PEOPLE BELIEVE IN GOD

  1. Belief in God is comforting, relieving, consoling, and gives meaning and purpose to life. (26.3%)

  2. Religious people have been raised to believe in God. (22.4%)

  3. The experience of God in everyday life/a feeling that God is in us. (16.2%)

  4. Just because/faith/or the need to believe in something. (13.0%)

  5. People believe because they fear death and the unknown. (9.1%)

  6. Arguments based on good design/natural beauty/perfection/complexity of the world or universe. (6.0%)

  Note that the intellectually based reasons for belief in God, "good design" and "experience of God," which ranked first and second for the question Why do you believe in God?, dropped to sixth and third place for the question Why do you think other people believe in God? Taking their place as the two most common reasons given for other people's belief in God were the emotionally based categories of religion being judged as "comforting" and people having been "raised to believe" in God. Grouping the answers into two general categories, rational reasons and emotional reasons for belief in God, we performed a chi-square test and found the difference to be significant (Chi-Square[1] = 328.63 [r = .49], N = 1,356, p < .0001). With an odds ratio of 8.8 to 1, we may conclude that people are nearly nine times more likely to attribute their own belief in God to rational reasons than to attribute other people's belief to such reasons; other people's belief they will attribute to emotional reasons.
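  For readers curious how such a statistic is computed, here is a minimal Python sketch of a chi-square test and odds ratio for a 2 x 2 table. The cell counts below are hypothetical, chosen only so that they sum to the reported N = 1,356 and approximately reproduce the reported statistics; they are not the survey's actual counts.

    # Hypothetical 2x2 table (illustrative counts only, not the survey's data):
    # rows = whose belief is being explained (self vs. others),
    # columns = type of reason given (rational vs. emotional).
    a, b = 533, 145   # self:   rational, emotional  (hypothetical)
    c, d = 200, 478   # others: rational, emotional  (hypothetical)

    n = a + b + c + d                      # total responses (1,356 here)
    # Pearson chi-square for a 2x2 table, via the standard closed form
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Odds ratio: odds of a rational attribution for self vs. for others
    odds_ratio = (a / b) / (c / d)
    # Phi coefficient, the 2x2 effect size often reported as r
    phi = (chi2 / n) ** 0.5
    print(f"chi2(1) = {chi2:.2f}, OR = {odds_ratio:.1f}, phi = {phi:.2f}")

  With these hypothetical counts the sketch prints roughly chi2(1) = 329, OR = 8.8, phi = 0.49, close to the figures reported above.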

  One explanation for this finding is the attribution bias: we attribute the causes of our own and others' behaviors to either a situation or a disposition. When we make a situational attribution, we identify the cause in the environment ("my depression is caused by a death in the family"); when we make a dispositional attribution, we identify the cause in the person as an enduring trait ("her depression is caused by a melancholy personality"). Problems in attribution may arise in our haste to accept the first cause that comes to mind (Gilbert et al. 1988). In addition, social psychologists Carol Tavris and Carole Wade (1997) explain that there is a tendency for people "to take credit for their good actions (a dispositional attribution) and let the situation account for their bad ones." In dealing with others, for example, we might attribute our own success to hard work and intelligence while attributing the other person's success to luck and circumstance (Nisbett and Ross 1980).

  We believe that we found evidence for an intellectual attribution bias, in which we consider our own actions as rationally motivated, whereas we see those of others as more emotionally driven. Our commitment to a belief is attributed to a rational decision and intellectual choice ("I'm against gun control because statistics show that crime decreases when gun ownership increases"), whereas the other person's belief is attributed to need and emotion ("he's for gun control because he's a bleeding-heart liberal who needs to identify with the victim"). This intellectual attribution bias applies to religion as a belief system and to God as the subject of belief. For pattern-seeking animals like us, the apparent good design of the universe, and the perceived action of a higher intelligence in the day-to-day contingencies of our lives, make for powerful intellectual justifications for belief. Yet we attribute other people's religious beliefs to their emotional needs and upbringing.

  Smart people, because they are more intelligent and better educated, are better able to give intellectual reasons justifying beliefs they arrived at for nonintellectual reasons. Yet smart people, like everyone else, recognize that emotional needs and being raised to believe something are how most of us, most of the time, come to our beliefs. The intellectual attribution bias then kicks in, especially in smart people, to justify those beliefs, no matter how weird they may be.

  Confirmation Bias. At the core of the Easy Answer to the Hard Question is the confirmation bias, or the tendency to seek or interpret evidence favorable to already existing beliefs, and to ignore or reinterpret evidence unfavorable to already existing beliefs. Psychologist Raymond Nickerson (1998), in a comprehensive review of the literature on this bias, concluded: "If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. ... it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations."

  Lawyers purposefully employ a type of confirmation bias in the confrontational style of courtroom reasoning, selecting the evidence that best suits their client and ignoring contradictory evidence (where winning the case trumps the truth or falsity of the claim); psychologists believe that, in fact, we all do this, usually unconsciously. In a 1989 study, psychologists Bonnie Sherman and Ziva Kunda presented students with evidence that contradicted a belief they held deeply and with evidence that supported that same belief; the students tended to attenuate the validity of the first set of evidence and accentuate the value of the second. In a 1989 study of children and young adults who were exposed to evidence inconsistent with a theory they preferred, Deanna Kuhn found that they "either failed to acknowledge discrepant evidence or attended to it in a selective, distorting manner. Identical evidence was interpreted one way in relation to a favored theory and another way in relation to a theory that was not favored." Even in recall after the experiment, subjects could not remember the contradictory evidence that had been presented. In a subsequent study in 1994, Kuhn exposed subjects to an audio recording of an actual murder trial and discovered that, instead of evaluating the evidence objectively, most subjects first composed a story of what happened and then sorted through the evidence to see what best fit that story. Interestingly, the subjects most focused on finding evidence for a single view of what happened (as opposed to those willing at least to consider an alternative scenario) were the most confident in their decision.

  Even in judging something as subjective as personality, psychologists have found that we see what we are looking for in a person. In a series of studies, subjects were asked to assess the personality of someone they were about to meet, some given a profile of an introvert (shy, timid, quiet), others given a profile of an extrovert (sociable, talkative, outgoing). When asked to make a personality assessment, those told that the person would be an extrovert asked questions that would lead to that conclusion; the group given the introvert profile did the same. Both groups found in the person the personality they were seeking (Snyder 1981). Of course, the confirmation bias works both ways in this experiment. It turns out that the subjects whose personalities were being evaluated tended to give answers that would confirm whatever hypothesis the interrogator was holding.

  The confirmation bias is not only pervasive; its effects can powerfully influence people's lives. In a 1983 study, John Darley and Paul Gross showed subjects a video of a child taking a test. One group was told that the child was from a high socioeconomic class; the other group was told that the child was from a low socioeconomic class. The subjects were then asked to evaluate the academic abilities of the child based on the results of the test. Not surprisingly, the group told of the high socioeconomic class rated the child's abilities as above grade level, while the group told of the low socioeconomic class rated the child's abilities as below grade level. In other words, the same data were interpreted differently by the two groups of evaluators, depending on their expectations. The data then confirmed those expectations.

  The confirmation bias also shapes our emotional states and prejudices. Hypochondriacs interpret every little ache and pain as an indication of the next great health calamity, whereas normal people simply ignore such random bodily signals (Pennebaker and Skelton 1978). Paranoia is another form of confirmation bias: if you strongly believe that "they" are out to get you, then you will interpret the wide diversity of anomalies and coincidences in life as evidence in support of that paranoid hypothesis. Likewise, prejudice depends on a type of confirmation bias, in which the prejudged expectations of a group's characteristics lead one to evaluate an individual member of that group in terms of those expectations (Hamilton et al. 1985). Even in depression, people tend to focus on those events and pieces of information that further reinforce the depression and to suppress evidence that things are, in fact, getting better (Beck 1976). As Nickerson noted in summary: "the presumption of a relationship predisposes one to find evidence of that relationship, even when there is none to be found or, if there is evidence to be found, to overweight it and arrive at a conclusion that goes beyond what the evidence justifies."

  Even scientists are subject to the confirmation bias. In search of a particular phenomenon, scientists interpreting data may see (or select) those data most in support of the hypothesis in question and ignore (or toss out) those data not in support of it. Historians of science have determined, for example, that in one of the most famous experiments in the history of science, the confirmation bias was hard at work. In 1919, the British astronomer Arthur Stanley Eddington tested Einstein's prediction of how much the sun would deflect light coming from a background star during an eclipse (the only time you can see stars near the sun). It turns out that Eddington's measurement error was as great as the effect he was measuring. As Stephen Hawking (1988) described it, "The British team's measurement had been sheer luck, or a case of knowing the result they wanted to get, not an uncommon occurrence in science." In going through Eddington's original data, sociologists of science Harry Collins and Trevor Pinch (1993) found that "Eddington could only claim to have confirmed Einstein because he used Einstein's derivations in deciding what his observations really were, while Einstein's derivations only became accepted because Eddington's observation seemed to confirm them. Observation and prediction were linked in a circle of mutual confirmation rather than being independent of each other as we would expect according to the conventional idea of an experimental test." In other words, Eddington found what he was looking for. Of course, science contains a special self-correcting mechanism for getting around the confirmation bias: other people will check your results or rerun the experiment. If your results are entirely the product of the confirmation bias, someone will sooner or later catch you on it. That is what sets science apart from all other ways of knowing.

  Finally, and most importantly for our purposes here, the confirmation bias operates to confirm and justify weird beliefs. Psychics, fortune tellers, palm readers, and astrologers, for example, all depend on the power of the confirmation bias by telling their clients (some would call them "marks") what to expect in their future. Because clients are offered one-sided events (instead of two-sided events in which more than one outcome is possible), the occurrence of the predicted event is noticed while its nonoccurrence is not. Consider numerology. The search for relationships among the various measurements and numbers available in almost any structure in the world (including the world itself, as well as the cosmos) has led numerous observers to find deep meaning in those numbers. The process is simple. You can start with the number you seek and try to find some relationship that ends in that number, or one close to it. Or, more commonly, you crunch through the numbers and see what pops out of the data that looks familiar. In the Great Pyramid, for example (as discussed in chapter 16), the ratio of the pyramid's base to the width of a casing stone is 365, the number of days in the year. Such number crunching with the confirmation bias in place has led people to "discover" in the pyramid the earth's mean density, the period of precession of the earth's axis, and the mean temperature of the earth's surface. As Martin Gardner (1957) wryly noted, this is a classic example of "the ease with which an intelligent man, passionately convinced of a theory, can manipulate his subject matter in such a way as to make it conform to precisely held opinions." And the more intelligent the better.
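  The mechanics are easy to demonstrate. Below is a small, hypothetical Python sketch of the numerologist's procedure: take a batch of arbitrary measurements, form all pairwise ratios, and flag anything that lands near a number you already consider meaningful. The measurements here are random, yet "discoveries" appear anyway.

    import itertools
    import random

    random.seed(0)
    # Thirty arbitrary "measurements" of some structure, in arbitrary units
    measurements = [random.uniform(1, 1000) for _ in range(30)]

    # Numbers the searcher already finds meaningful
    targets = {"days in a year": 365.25, "pi": 3.14159, "golden ratio": 1.61803}

    hits = []
    for x, y in itertools.permutations(measurements, 2):
        ratio = x / y
        for name, value in targets.items():
            # Flag any ratio within 1 percent of a "meaningful" number
            if abs(ratio - value) / value < 0.01:
                hits.append((round(x, 1), round(y, 1), name))

    # With 30 random measurements there are 870 ordered ratios, so on
    # average several land near some target purely by chance.
    print(len(hits), "spurious 'discoveries':", hits[:3])

  The more measurements and the more "meaningful" targets one admits, the more such coincidences turn up, which is the confirmation bias doing its work.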

  So, in sum, being either high or low in intelligence is orthogonal to, and independent of, the normality or weirdness of the beliefs one holds. But these variables are not without some interaction effects. High intelligence, as noted in my Easy Answer, makes one skilled at defending beliefs arrived at for non-smart reasons. In chapter 3 I discuss a study conducted by psychologist David Perkins (1981), in which he found a positive relationship between intelligence and the ability to justify beliefs, and a negative relationship between intelligence and the ability to consider other beliefs as viable. That is to say, smart people are better at rationalizing their beliefs with reasoned arguments, but as a consequence they are less open to considering other positions. So, although intelligence does not affect what you believe, it does influence how beliefs are justified, rationalized, and defended after they are acquired for non-smart reasons.

  Enough theory. As the architect Mies van der Rohe noted, God dwells in the details. The following examples of the difference between intelligence and belief are carefully chosen not from the lunatic fringe or culturally marginalized, but from the socially mainstream and especially from the academy. That is what makes the Hard Question so hard. It is one thing to evaluate the claims of a government coverup from a raving conspiratorialist publishing a newsletter out of his garage in Fringeville, Idaho; it is quite another when it comes from a Columbia University political science professor, or from a Temple University history professor, or from an Emory University social scientist, or from a multimillionaire business genius from Silicon Valley, or from a Pulitzer Prize-winning professor of psychiatry at Harvard University.

  UFOs and Alien Abductions: A Weird Belief with Smart Supporters

  UFOs and alien abductions meet my criteria for a weird thing because the claim that such sightings and experiences represent actual encounters with extraterrestrial intelligences is (1) unaccepted by most people in astronomy, exobiology, and the Search for Extra-Terrestrial Intelligence (despite the near universal desire among practitioners to find life of any grade somewhere other than earth), (2) extremely unlikely (although not logically impossible), and (3) largely based on anecdotal and uncorroborated evidence. Are UFO and alien abduction claims supported by smart people? While the community of believers used to be populated largely by those in the nooks and crannies of society's fringes, it has successfully migrated into the cultural mainstream. In the 1950s and 1960s, those who told stories of alien encounters were, at best, snickered at behind closed doors (and sometimes when the doors were wide open) or, at worst, sent to psychiatrists for mental health evaluations. And they were always the butt of jokes among scientists. But in the 1970s and 1980s a gradual shift occurred in the credentials of the believers, and in the 1990s they received a boost from the academy that has helped metastasize their beliefs into society's main body.

  Consider Jodi Dean's widely reviewed 1998 book Aliens in America. Dean is a Columbia University Ph.D., a professor of political science at Hobart and William Smith Colleges, and a noted feminist scholar. Her book is published by Cornell University Press and begins as if it is going to be a thoughtful sociology of UFOlogy, with a thesis that abductees feel "alienated" from modern American society because of economic insecurities, threats of environmental destruction, worldwide militarism, colonialism, racism, misogyny, and other cultural bogeymen: "My argument is that the aliens infiltrating American popular cultures provide icons through which to access the new conditions of democratic politics at the millennium." But since Dean rejects science and rationality as methods of discriminating between sense and nonsense, we "have no criteria for choosing among policies and verdicts, treatments and claims. Even further, we have no recourse to procedures, be they scientific or juridical, that might provide some 'supposition of reasonableness.'" For Dean, not only is science not a solution, it is part of the problem: "'Scientists' are the ones who have problems with the 'rationality' of those in the UFO community. 'Scientists' are the ones who feel a need to explain why some people believe in flying saucers, or who dismiss those who do so as 'distorted' or 'prejudiced' or 'ignorant.'" Indeed, Dean concludes, since postmodernism has shown all truth to be relative and consensual, the UFOlogists' claims are as true as anyone's: "The early ufologists fought against essentialist understandings of truth that would inscribe truth in objects (and relations between objects) in the world. Rejecting this idea, they relied on an understanding of truth as consensual. If our living in the world is an outcome of a consensus on reality, then stop and notice that not everyone is consenting to the view of reality espoused by science and government."

  With this relativist view of truth, Dean never tells us whether she believes the UFO/abduction narratives told by her subjects. So I asked her just that in a radio interview, to which she replied: "I believe that they believe their stories." I acknowledged the clarification but pressed the point: "But what do you believe?" Dean refused to answer the question. Fair enough, I suppose, since she is trying to take a nonjudgmental perspective (although I could not get her to offer an opinion even off the air and off the record). But my point here is that by so doing this smart person lends credence to a weird belief, adding to its credibility as a tenet of truth fit for acceptable social dialogue when, in fact, there is no more evidence for the existence of aliens on earth than there is for fairies (which, in the 1920s, enjoyed their own cultural heyday and the backing of smart people like the creator of Sherlock Holmes, Arthur Conan Doyle; see Randi 1982).

 
