unSpun


by Brooks Jackson


  Chapter 4

  UFO Cults and Us

  Why We Get Spun

  A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.

  —LEON FESTINGER, et al., When Prophecy Fails (1956)

  HAVE YOU EVER WONDERED WHY OTHER PEOPLE ARE SO UNREASONABLE and hard to convince? Why is it that they disregard hard facts that prove you’re right and they’re wrong? The fact is, we humans aren’t wired to think very rationally. That’s been confirmed recently by brain scans, but our irrational reaction to hard evidence has been the subject of scholarly study for some time. Consider one of the most famous scientific observations in all of psychology, the story of a UFO cult that was infiltrated by the social psychologist Leon Festinger and his colleagues half a century ago.

  They observed a small group of true believers whose leader was a woman the authors called Marian Keech, in a place they called Lake City. Mrs. Keech said she had received messages from beings called Guardians on Planet Clarion saying that North America would be destroyed by a flood, but that her followers would be taken to safety on a UFO a few hours earlier, at midnight on December 21, 1954. Members of her cult quit jobs, sold possessions, dropped out of school, and prepared for the space journey by removing metal objects from their clothing as instructed by Mrs. Keech. They gathered in her living room on the appointed night, and waited.

  Midnight came and went, and of course neither the flood nor the promised spacecraft arrived. But did the faithful look at this incontrovertible evidence and conclude, “Oh, well, we were wrong”? Not a chance. At 4:45 A.M., as morning approached, Mrs. Keech told them that she had received another message: the cataclysm had been called off because of the believers’ devotion. “Not since the beginning of time upon this Earth,” she said, “has there been such a force of Good and light as now floods this room.” And then she told the group to go spread the word of this miracle. One member walked out, fed up with Keech’s failed predictions, but the others were jubilant. In the days that followed, Festinger and his colleagues reported, cult members who previously had shown no interest in proselytizing now put their energies into gaining converts. They became even more committed to their cause after seeing what any reasonable person would conclude was shattering proof that they had been completely wrong.

  Why should this be? The reason, Festinger tells us, is that it is psychologically painful to be confronted with information that contradicts what we believe. For Mrs. Keech’s followers to concede error would have meant admitting they had been colossal fools. But convincing others of their beliefs would not only avoid any such embarrassment, it would provide some false comfort, as though more believers would constitute more proof.

  The Keech cult is an extreme case, but the discomfort at being confronted with evidence of error is a universal human emotion. It’s just no fun to admit we’ve been wrong. So we strive to avoid that unpleasant feeling of psychological conflict—what Festinger calls cognitive dissonance—that occurs when deeply held beliefs are challenged by conflicting evidence. Keech’s followers could have reduced their dissonance by abandoning their religious beliefs, but instead some of them adopted an explanation of the new evidence that was compatible with what they already believed. Cognitive dissonance is a fancy term for something all of us feel from time to time. Decades of social science experiments have shown that, in a sense, there’s a little UFO cultist in everybody.

  The Moonbat Effect

  A more recent example took place in 2006 when a website called truthout.org reported the political equivalent of a UFO landing that didn’t happen. On Saturday, May 13, the truthout contributor Jason Leopold reported that President Bush’s top political adviser, Karl Rove, had been indicted for perjury and lying to investigators in a CIA leak investigation, that Rove’s lawyer had been served with the indictment papers during a fifteen-hour meeting at his law offices the previous day, and that the prosecutor would announce the charges publicly within a week. All this was reported as fact, without any “maybes” or qualifications.

  The news was celebrated by followers of truthout, a liberal, anti-Bush site. One commenter on the website’s “Town Meeting” message board even suggested a war crimes trial, saying “How about sending him and his bosses to The Hague?” Another gloated, “All that needs to be done now is to send him to the crossbar hotel, under the loving care of Bubba & No-Neck.” The fact that no major news organization was reporting the Rove indictment was brushed off as evidence of the supposed pro-Bush, antiliberal bias of the “MSM” or mainstream media.

  But the story quickly drew strong denials. Rove’s lawyer said he himself had been out of his office that day, taking his sick cat to the veterinarian, and certainly had not been served with an indictment. Days passed with no indictment announced, and on Thursday truthout’s executive director, Marc Ash, said that “additional sources have now come forward and offered corroboration to us.” At the end of the week in which the indictment was supposed to be announced, Ash issued a “partial apology,” but only about the timing: “We erred in getting too far out in front of the news-cycle.” He kept the original story on the website. On May 25 Ash wrote, “We have now three independent sources confirming that attorneys for Karl Rove were handed an indictment.” On June 3 he wrote, “We stand by the story.” And still no indictment.

  A number of readers expressed disappointment, but others seemed to cling even more firmly to the reporter’s and editor’s now-discredited story. “Mr. Leopold’s version may have been perfectly correct but changed by subsequent events he was not privy to,” wrote one, posting a comment on truthout’s “Town Meeting” message board. Another wrote, “No apology needed. I am glad you took the risk and you confirmed my faith in you.” Still another wrote, “Persevere, keep the faith and above all keep up the good work. You can expect my contribution next week.” Even when Rove’s lawyer finally announced on June 13 that the special prosecutor had informed him that Rove wasn’t going to be indicted, a diehard few continued to believe truthout. “I know that I trust their word over what the media says,” one supporter wrote. Ash still kept the indictment story on the website and said of Rove’s lawyer, “We question his motives.”

  However, by this time most of the comments on the “Town Meeting” board reflected a view of reporter Leopold and editor Ash as doggedly self-deceived, or worse. One wrote, “I’ve been on your side all along and am a great believer in TO [truthout], but this borders on lunacy.” And another said, “There’s probably a 12 step program out there for this affliction. You believe, with all your heart, soul, mind and body that a story is correct. You base your belief, not on evidence, logic or reason, but simply because you want to believe so badly, the thought of it being wrong invalidates your very existence and that makes your head hurt.” That didn’t please truthout’s proprietors, who removed the comment. But it describes the effects of cognitive dissonance pretty well.

  Extreme political partisans sometimes display such irrational thinking that they have come to be dismissed as “barking moonbats.” A barking moonbat is “someone who sacrifices sanity for the sake of consistency,” according to the London blogger Adriana Cronin-Lukas of Samizdata.net, who helped popularize the colorful phrase. It is most often applied derisively to extreme partisans on the left, but we use it as originally intended, to apply to all far-out cases whose beliefs make them oblivious to facts, regardless of party or ideology.

  The Psychology of Deception

  In the past half century, the science of psychology has taught us a lot about how and why we get things wrong. As we’ll see, our minds betray us not only when it comes to politics, but in all sorts of matters, from how we see a sporting event, or even a war, to the way we process a sales pitch. Humans are not by nature the fact-driven, rational beings we like to think we are. We get the facts wrong more often than we think we do. And we do so in predictable ways: we engage in wishful thinking. We embrace information that supports our beliefs and reject evidence that challenges them. Our minds tend to take shortcuts, which require some effort to avoid. Only a few of us go to moonbat extremes, but more often than most of us would imagine, the human mind operates in ways that defy logic.

  Psychological experiments have shown, for one thing, that humans tend to seek out even weak evidence to support their existing beliefs, and to ignore evidence that undercuts those beliefs. In the process, we apply stringent tests to evidence we don’t want to hear, while letting slide uncritically into our minds any information that suits our needs. Psychology also tells us that we rarely work through reasons and evidence in a systematic way, weighing information carefully and suspending the impulse to draw conclusions. Instead, much of the time we use mental shortcuts or rules of thumb that save us mental effort. These habits often work reasonably well, but they also can lead us to conclusions we might dismiss if we applied more thought.

  Another common psychological trap is that we tend to overgeneralize from vivid, dramatic single examples. News of a terrible airline crash makes us think of commercial flying as dangerous; we forget that more than 10 million airline passenger flights land safely every year in the United States alone. Overhyped coverage of a few dramatic crimes on the local news makes us leap to the conclusion that crime is rampant, even when a look at some hard statistics would convince us that the violent crime rate in 2005 (despite a small increase that year) was actually 38 percent lower than its peak in 1991.
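  For readers who want to check arithmetic like that for themselves, the computation behind “38 percent lower” is simple. Here is a minimal sketch in Python; the specific rates used are an assumption on our part (FBI Uniform Crime Report estimates of violent crimes per 100,000 population), since the text above gives only the resulting percentage:

      # A sketch of the percent-decline arithmetic behind the crime statistic.
      # The rates below are assumptions (FBI Uniform Crime Report estimates of
      # violent crimes per 100,000 population); the text gives only the result.
      peak_1991 = 758.2  # assumed violent crime rate in 1991, the peak year
      rate_2005 = 469.0  # assumed violent crime rate in 2005

      decline = (peak_1991 - rate_2005) / peak_1991
      print(f"The 2005 rate is {decline:.0%} below the 1991 peak.")  # about 38%

  The same habit of checking the denominator helps with the airline example: weighing a handful of crashes against more than 10 million flights a year puts the vivid image in proportion.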

  Psychologists have also found that when we feel most strongly that we are right, we may in fact be wrong. And they have found that people making an argument or a supposedly factual claim can manipulate us by the words they choose and the way they present their case. We can’t avoid letting language do our thinking for us, but we can become more aware of how and when language is steering us toward a conclusion that, upon reflection, we might choose to reject. In this chapter we’ll describe ways in which we get spun, and how to avoid the psychological pitfalls that lead us to ignore facts or believe bad information.

  And this has nothing to do with intelligence. Presidents, poets, and even professors and journalists fall into these traps. You do it. We all do it. But we are less likely to do it if we learn to recognize how our minds can trick us, and learn to step around the mental snares nature has set for us. The late Amos Tversky, a psychologist who pioneered the study of decision errors in the 1970s, frequently said, “Whenever there is a simple error that most laymen fall for, there is always a slightly more sophisticated version of the same problem that experts fall for.”

  Experts such as the doctors, lawyers, and college professors who made up a group convened in 1988 by Kathleen Jamieson to study the impact of political ads. They all thought other people were being fooled by advertising, but that they themselves were not. They were asked, “What effect, if any, do you think the presidential ads are having on voters in general and us in particular?” Every person in the group said that voters were being manipulated. But each one also insisted that he or she was invulnerable to persuasion by ads. Then the discussion moved on to the question “What ads have you seen in the past week?” One participant recounted that he had seen an ad on Democratic presidential candidate Michael Dukakis’s “revolving-door prison policy.” The group then spent time talking about Dukakis’s policies on crime control, and in that discussion each adopted the language used in the Republican ad. The ad’s language had begun to do their thinking about Dukakis for them. Without being aware of it, they had absorbed and embraced the metaphor that Dukakis had installed a “revolving door” on state prisons. The notion that others will be affected by exposure to messages while we are immune is called the third-person effect.

  The other side of the third-person effect is wishful thinking. Why do most people think they are more likely than they actually are to live past eighty? Why do most believe that they are better than average drivers? At times we all seem to live in Garrison Keillor’s fictional Lake Wobegon, where “all the children are above average.” To put it differently: in some matters we are unrealistic about how unrealistic we actually are.

  The “Pictures in Our Heads” Trap

  Misinformation is easy to accept if it reinforces what we already believe. The journalist Walter Lippmann observed in 1922 that we are all captives of the “pictures in our heads,” which don’t always conform to the real world. “The way in which the world is imagined determines at any particular moment what men will do,” Lippmann wrote in his book Public Opinion. Deceivers are aware of the human tendency to think in terms of stereotypes, and they exploit it.

  A good example of this is the way the Bush campaign in 2004 got away with falsely accusing John Kerry of repeatedly trying to raise gasoline taxes and of favoring a 50-cent increase at a time when fuel prices were soaring. A typical ad said, “He supported higher gasoline taxes eleven times.” In fact, Kerry had voted for a single 4.3-cent increase, in 1993. The following year, he flirted briefly in a newspaper interview with the idea of a 50-cent-a-gallon gasoline tax, but quickly dropped the notion and never mentioned it again. Nevertheless, most voters swallowed the Republican falsehood without a hiccup.

  Democrats exploit this psychological trap, too. In the previous chapter we noted how Kerry had falsely accused Bush of favoring big “cuts” in Social Security benefits. That was also widely believed. In the National Annenberg Election Survey, nearly half of respondents—49 percent—tended to believe a statement that Bush’s plan would “cut Social Security benefits 30 to 45 percent.” Only 37 percent found the statement untruthful. And we want to stress an important point here: the public wasn’t thinking of a “cut” as a slowdown in the growth of benefits for those retiring decades in the future; they were thinking of a real cut. The Annenberg survey found that 50 percent rated as truthful a statement that “Bush’s plan to cut Social Security would cut benefits for those currently receiving them,” and only 43 percent found the statement untruthful.

  Bush, of course, had stated repeatedly during his 2000 campaign and throughout his first term that he would not cut current benefits by a penny. So why would a plurality of the public believe the opposite? The reason, we believe, is that Social Security is thought of as a Democratic program and Republicans are thought of as opposing it (somewhat unfairly: large majorities of Republican House and Senate members voted for it when it was passed in 1935). Because of the stereotyped view that Republicans don’t favor Social Security, too few took the trouble to question the idea that Bush would cut current benefits.

  It is difficult for candidates to overcome these stereotypes. In 1990, voters in Pennsylvania were disposed to believe that the Republican nominee, Barbara Hafer, opposed abortion rights and that the incumbent Democratic governor, Robert Casey, favored a legal right to abortion; the reverse was true. In general, Democrats are more likely to be “pro-choice” and Republicans “pro-life,” but not in this case. A good rule is that “in general” doesn’t necessarily apply to “this specific.”

  Sometimes we have to avoid mental shortcuts and take the long way around if we want to avoid being manipulated or making these mistakes on our own. Ask yourself: “Is the picture in my head a good likeness of reality? Does this Democrat in fact favor this tax increase? Does this Republican in fact want to cut Social Security benefits? Where’s the evidence?”

  The “Root for My Side” Trap

  Related to the “pictures in our heads” trap is the “root for my side” trap. There’s evidence that our commitment to a cause not only colors our thinking but also affects what we see—and don’t see—as we observe the world around us.

  Psychologists have known about this phenomenon for a long time. In 1954, Albert Hastorf and Hadley Cantril published a classic study of how Princeton and Dartmouth football fans saw a penalty-ridden game in which the Princeton quarterback was taken off the field with a broken nose and a mild concussion and a Dartmouth player later suffered a broken leg. They found that 86 percent of the Princeton students said that Dartmouth started the rough play, but only 36 percent of the Dartmouth students saw it that way. The researchers asked, “Do you believe the game was clean and fairly played or that it was unnecessarily rough and dirty?” The results: 93 percent of Princeton fans said “rough and dirty,” compared with only 42 percent of Dartmouth fans. When shown films of the game and asked to count actual infractions by Dartmouth players, Princeton students spotted an average of 9.8, more than twice the 4.3 infractions noted by Dartmouth students.

  Were the Dartmouth and Princeton fans deliberately distorting what they saw? Probably not. Hastorf and Cantril went so far as to say that people don’t just have different attitudes about things, they actually see different things: “For the ‘thing’ simply is not the same for different people whether the ‘thing’ is a football game, a presidential candidate, Communism or spinach.”

  We don’t know about spinach, but when it comes to presidential candidates a recent study using brain-scanning technology supports the notion that people really do see things differently, depending on whom they back. The scans also suggest that emotion takes over and logic doesn’t come into play. (See “This is your brain on politics” box.)

  A closely related effect is what researchers call the hostile media phenomenon. Three Stanford University researchers demonstrated this in 1985 by showing pro-Arab and pro-Israeli audiences identical news accounts of the massacre of several hundred persons in the Sabra and Shatila refugee camps near Beirut, Lebanon, in 1982. Each side detected more negative than positive references to their side, and each side thought the coverage was likely to sway neutral observers in a direction hostile to them. Why? Probably because content that agrees with our own views simply seems true and thus not very noteworthy, while material that counters our biases stands out in our minds and makes us look for a reason to reject it. So, to a conservative, news with a conservative slant is fair; to a liberal, news with a liberal slant is fair; and to both, there is something unfair somewhere in any news program that tries to balance alternative points of view.

 
