Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time

by Michael Shermer


  13. Coincidence

  In the paranormal world, coincidences are often seen as deeply significant. "Synchronicity" is invoked, as if some mysterious force were at work behind the scenes. But I see synchronicity as nothing more than a type of contingency—a conjuncture of two or more events without apparent design. When the connection is made in a manner that seems impossible according to our intuition of the laws of probability, we have a tendency to think something mysterious is at work.

  But most people have a very poor understanding of the laws of probability. A gambler will win six in a row and then think he is either "on a hot streak" or "due to lose." Two people in a room of thirty people discover that they have the same birthday and conclude that something mysterious is at work. You go to the phone to call your friend Bob. The phone rings and it is Bob. You think, "Wow, what are the chances? This could not have been a mere coincidence. Maybe Bob and I are communicating telepathically." In fact, such coincidences are not coincidences under the rules of probability. The gambler has predicted both possible outcomes, a fairly safe bet! The probability that two people in a room of thirty people will have the same birthday is .71. And you have forgotten how many times Bob did not call under such circumstances, or someone else called, or Bob called but you were not thinking of him, and so on. As the behavioral psychologist B. F. Skinner proved in the laboratory, the human mind seeks relationships between events and often finds them even when they are not present. Slot machines are based on Skinnerian principles of intermittent reinforcement. The dumb human, like the dumb rat, only needs an occasional payoff to keep pulling the handle. The mind will do the rest.
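
  The birthday figure is easy to verify. Here is a minimal sketch in Python (my own illustration, not from the text) computing the exact probability, assuming 365 equally likely birthdays:

    def birthday_match_probability(n):
        # Probability that at least two of n people share a birthday,
        # assuming 365 equally likely birthdays and ignoring leap years.
        p_all_distinct = 1.0
        for i in range(n):
            p_all_distinct *= (365 - i) / 365
        return 1 - p_all_distinct

    print(round(birthday_match_probability(30), 2))  # 0.71

  The result feels impossible because there are 435 possible pairs among thirty people, not just twenty-nine comparisons against any one person's birthday.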

  14. Representativeness

  As Aristotle said, "The sum of the coincidences equals certainty." We forget most of the insignificant coincidences and remember the meaningful ones. Our tendency to remember hits and ignore misses is the bread and butter of the psychics, prophets, and soothsayers who make hundreds of predictions each January 1. First they increase the probability of a hit by predicting mostly generalized sure bets like "There will be a major earthquake in southern California" or "I see trouble for the Royal Family." Then, next January, they publish their hits and ignore the misses, and hope no one bothers to keep track.
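
  The arithmetic behind this strategy is simple: if each vague prediction has even a modest chance of hitting, the chance that at least one of many hits approaches certainty. A brief sketch in Python (the 5 percent hit rate is an illustrative assumption, not a measured value):

    def chance_of_at_least_one_hit(p, n):
        # Probability that at least one of n independent predictions,
        # each with hit probability p, comes true: 1 - (1 - p)^n.
        return 1 - (1 - p) ** n

    # Illustrative: 100 vague predictions, each with a 5% chance of a hit.
    print(round(chance_of_at_least_one_hit(0.05, 100), 3))  # 0.994

  Publish the handful of hits, bury the dozens of misses, and the January list looks prophetic.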

  We must always remember the larger context in which a seemingly unusual event occurs, and we must always analyze unusual events for their representativeness of their class of phenomena. In the case of the "Bermuda Triangle," an area of the Atlantic Ocean where ships and planes "mysteriously" disappear, there is the assumption that something strange or alien is at work. But we must consider how representative such events are in that area. Far more shipping lanes run through the Bermuda Triangle than its surrounding areas, so accidents and mishaps and disappearances are more likely to happen in the area. As it turns out, the accident rate is actually lower in the Bermuda Triangle than in surrounding areas. Perhaps this area should be called the "Non-Bermuda Triangle." (See Kusche 1975 for a full explanation of this solved mystery.) Similarly, in investigating haunted houses, we must have a baseline measurement of noises, creaks, and other events before we can say that an occurrence is unusual (and therefore mysterious). I used to hear rapping sounds in the walls of my house. Ghosts? Nope. Bad plumbing. I occasionally hear scratching sounds in my basement. Poltergeists? Nope. Rats. One would be well-advised to first thoroughly understand the probable worldly explanation before turning to other-worldly ones.
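
  The base-rate point can be made explicit: what matters is the accident rate per transit, not the raw count of accidents. A small sketch with hypothetical numbers, chosen only to show how a busier region can have more accidents yet a lower rate:

    # Hypothetical traffic and accident counts (illustrative only).
    regions = {
        "Bermuda Triangle": {"transits": 100_000, "accidents": 50},
        "Surrounding area": {"transits": 20_000, "accidents": 15},
    }

    for name, data in regions.items():
        rate = data["accidents"] / data["transits"]
        print(f"{name}: rate = {rate:.5f} accidents per transit")

    # More accidents in absolute terms (50 vs. 15),
    # but a lower rate (0.00050 vs. 0.00075 per transit).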

  Logical Problems in Thinking

  15. Emotive Words and False Analogies

  Emotive words are used to provoke emotion and sometimes to obscure rationality. They can be positive emotive words—motherhood, America, integrity, honesty. Or they can be negative—rape, cancer, evil, communist. Likewise, metaphors and analogies can cloud thinking with emotion or steer us onto a side path. A pundit talks about inflation as "the cancer of society" or industry "raping the environment." In his 1992 Democratic nomination speech, Al Gore constructed an elaborate analogy between the story of his sick son and America as a sick country. Just as his son, hovering on the brink of death, was nursed back to health by his father and family, America, hovering on the brink of death after twelve years of Reagan and Bush, was to be nurtured back to health under the new administration. Like anecdotes, analogies and metaphors do not constitute proof. They are merely tools of rhetoric.

  16. Ad Ignorantiam

  This is an appeal to ignorance or lack of knowledge and is related to the "burden of proof" and "unexplained is not inexplicable" fallacies, where someone argues that if you cannot disprove a claim, it must be true. For example, if you cannot prove that there isn't any psychic power, then there must be. The absurdity of this argument comes into focus if one argues that if you cannot prove that Santa Claus does not exist, then he must exist. You can argue the opposite in a similar manner. If you cannot prove Santa Claus exists, then he must not exist. In science, belief should come from positive evidence in support of a claim, not lack of evidence for or against a claim.

  17. Ad Hominem and Tu Quoque

  Literally "to the man" and "you also," these fallacies redirect the focus from thinking about the idea to thinking about the person holding the idea. The goal of an ad hominem attack is to discredit the claimant in hopes that it will discredit the claim. Calling someone an atheist, a communist, a child abuser, or a neo-Nazi does not in any way disprove that person's statement. It might be helpful to know whether someone is of a particular religion or holds a particular ideology, in case this has in some way biased the research, but refuting claims must be done directly, not indirectly. If Holocaust deniers, for example, are neo-Nazis or anti-Semites, this would certainly guide their choice of which historical events to emphasize or ignore. But if they are making the claim, for example, that Hitler did not have a master plan for the extermination of European Jewry, the response "Oh, he is saying that because he is a neo-Nazi" does not refute the argument. Whether Hitler had a master plan or not is a question that can be settled historically. Similarly for tu quoque. If someone accuses you of cheating on your taxes, the answer "Well, so do you" is no proof one way or the other.

  18. Hasty Generalization

  In logic, the hasty generalization is a form of improper induction. In life, it is called prejudice. In either case, conclusions are drawn before the facts warrant it. Perhaps because our brains evolved to be constantly on the lookout for connections between events and causes, this fallacy is one of the most common of all. A couple of bad teachers mean a bad school. A few bad cars mean that brand of automobile is unreliable. A handful of members of a group are used to judge the entire group. In science, we must carefully gather as much information as possible before announcing our conclusions.

  19. Overreliance on Authorities

  We tend to rely heavily on authorities in our culture, especially if the authority is considered to be highly intelligent. The IQ score has acquired nearly mystical proportions in the last half century, but I have noticed that belief in the paranormal is not uncommon among Mensa members (the high-IQ club for those in the top 2 percent of the population); some even argue that their "Psi-Q" is also superior. Magician James Randi is fond of lampooning authorities with Ph.D.s—once they are granted the degree, he says, they find it almost impossible to say two things: "I don't know" and "I was wrong." Authorities, by virtue of their expertise in a field, may have a better chance of being right in that field, but correctness is certainly not guaranteed, and their expertise does not necessarily qualify them to draw conclusions in other areas.

  In other words, who is making the claim makes a difference. If it is a Nobel laureate, we take note because he or she has been right in a big way before. If it is a discredited scam artist, we give a loud guffaw because he or she has been wrong in a big way before. While expertise is useful for separating the wheat from the chaff, it is dangerous in that we might either (1) accept a wrong idea just because it was supported by someone we respect (false positive) or (2) reject a right idea just because it was supported by someone we disrespect (false negative). How do you avoid such errors? Examine the evidence.

  20. Either-Or

  Also known as the fallacy of negation or the false dilemma, this is the tendency to dichotomize the world so that if you discredit one position, the observer is forced to accept the other. This is a favorite tactic of creationists, who claim that life either was divinely created or evolved. Then they spend the majority of their time discrediting the theory of evolution so that they can argue that since evolution is wrong, creationism must be right. But it is not enough to point out weaknesses in a theory. If your theory is indeed superior, it must explain both the "normal" data explained by the old theory and the "anomalous" data not explained by the old theory. A new theory needs evidence in favor of it, not just against the opposition.

  21. Circular Reasoning

  Also known as the fallacy of redundancy, begging the question, or tautology, this occurs when the conclusion or claim is merely a restatement of one of the premises. Christian apologetics is filled with tautologies: Is there a God? Yes. How do you know? Because the Bible says so. How do you know the Bible is correct? Because it was inspired by God. In other words, God is because God is. Science also has its share of redundancies: What is gravity? The tendency for objects to be attracted to one another. Why are objects attracted to one another? Gravity. In other words, gravity is because gravity is. (In fact, some of Newton's contemporaries rejected his theory of gravity as being an unscientific throwback to medieval occult thinking.) Obviously, a tautological operational definition can still be useful. Yet, difficult as it is, we must try to construct operational definitions that can be tested, falsified, and refuted.

  22. Reductio ad Absurdum and the Slippery Slope

  Reductio ad absurdum is the refutation of an argument by carrying the argument to its logical end and so reducing it to an absurd conclusion. Surely, if an argument's consequences are absurd, it must be false. This is not necessarily so, though sometimes pushing an argument to its limits is a useful exercise in critical thinking; often this is a way to discover whether a claim has validity, especially if an experiment testing the actual reduction can be run. Similarly, the slippery slope fallacy involves constructing a scenario in which one thing leads ultimately to an end so extreme that the first step should never be taken. For example: Eating Ben & Jerry's ice cream will cause you to put on weight. Putting on weight will make you overweight. Soon you will weigh 350 pounds and die of heart disease. Eating Ben & Jerry's ice cream leads to death. Don't even try it. Certainly eating a scoop of Ben & Jerry's ice cream may contribute to obesity, which could possibly, in very rare cases, cause death. But the consequence does not necessarily follow from the premise.

  Psychological Problems in Thinking

  23. Effort Inadequacies and the Need for Certainty, Control, and Simplicity

  Most of us, most of the time, want certainty, want to control our environment, and want nice, neat, simple explanations. All this may have some evolutionary basis, but in a multifarious society with complex problems, these characteristics can radically oversimplify reality and interfere with critical thinking and problem solving. For example, I believe that paranormal beliefs and pseudoscientific claims flourish in market economies in part because of the uncertainty of the marketplace. According to James Randi, after communism collapsed in Russia there was a significant increase in such beliefs. Not only are the people now freer to try to swindle each other with scams and rackets but many truly believe they have discovered something concrete and significant about the nature of the world. Capitalism is a lot less stable a social structure than communism. Such uncertainties lead the mind to look for explanations for the vagaries and contingencies of the market (and life in general), and the mind often takes a turn toward the supernatural and paranormal.

  Scientific and critical thinking does not come naturally. It takes training, experience, and effort, as Alfred Mander explained in his Logic for the Millions: "Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically—without learning how, or without practicing. People with untrained minds should no more expect to think clearly and logically than people who have never learned and never practiced can expect to find themselves good carpenters, golfers, bridge players, or pianists" (1947, p. vii). We must always work to suppress our need to be absolutely certain and in total control and our tendency to seek the simple and effortless solution to a problem. Now and then the solutions may be simple, but usually they are not.

  24. Problem-Solving Inadequacies

  All critical and scientific thinking is, in a fashion, problem solving. There are numerous psychological disruptions that cause inadequacies in problem solving. Psychologist Barry Singer has demonstrated that when people are given the task of selecting the right answer to a problem after being told whether particular guesses are right or wrong, they:

  A. Immediately form a hypothesis and look only for examples to confirm it.

  B. Do not seek evidence to disprove the hypothesis.

  C. Are very slow to change the hypothesis even when it is obviously wrong.

  D. If the information is too complex, adopt overly simple hypotheses or strategies for solutions.

  E. If there is no solution, if the problem is a trick and "right" and "wrong" is given at random, form hypotheses about coincidental relationships they observed. Causality is always found. (Singer and Abell 1981, p. 18)
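
  Point E in particular is easy to demonstrate with a simulation: hand out random "right"/"wrong" feedback and almost any candidate rule will appear partially confirmed. A minimal sketch in Python (my own illustration of the statistical point, not Singer's actual procedure):

    import random

    random.seed(1)

    # Random feedback: each of 50 guesses is called "right" or "wrong"
    # by a coin flip, so there is no rule to discover.
    feedback = [random.choice(["right", "wrong"]) for _ in range(50)]

    # A subject's candidate rule: "even-numbered trials are right."
    # Count how often the random feedback happens to agree with it.
    agreements = sum(
        1 for trial, result in enumerate(feedback)
        if (result == "right") == (trial % 2 == 0)
    )
    print(f"Rule agreed with random feedback on {agreements} of 50 trials")

  By chance the rule will agree with the feedback on roughly half the trials, and a mind that counts the agreements and forgets the disagreements will find it confirmed.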

  If this is the case with humans in general, then we all must make the effort to overcome these inadequacies in solving the problems of science and of life.

  25. Ideological Immunity, or the Planck Problem

  In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: "educated, intelligent, and successful adults rarely change their most fundamental presuppositions" (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the more well-founded their theories have become (and remember, we all tend to look for and remember confirmatory evidence, not counterevidence), the greater the confidence in their ideologies. The consequence of this, however, is that we build up an "immunity" against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: "An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning" (1936, p. 97).

  Psychologist David Perkins conducted an interesting correlational study in which he found a strong positive correlation between intelligence (measured by a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a strong negative correlation between intelligence and the ability to consider other alternatives. That is, the higher the IQ, the greater the potential for ideological immunity. Ideological immunity is built into the scientific enterprise, where it functions as a filter against potentially overwhelming novelty. As historian of science I. B. Cohen explained, "New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result" (1985, p. 35).

  In the end, history rewards those who are "right" (at least provisionally). Change does occur. In astronomy, the Ptolemaic geocentric universe was slowly displaced by Copernicus's heliocentric system. In geology, George Cuvier's catastrophism was gradually wedged out by the more soundly supported uniformitarianism of James Hutton and Charles Lyell. In biology, Darwin's theory of evolution superseded creationist belief in the immutability of species. In Earth history, Alfred Wegener's idea of continental drift took nearly a half century to overcome the received dogma of fixed and stable continents. Ideological immunity can be overcome in science and in daily life, but it takes time and corroboration.

  Spinoza's Dictum

  Skeptics have the very human tendency to relish debunking what we already believe to be nonsense. It is fun to recognize other people's fallacious reasoning, but that's not the whole point. As skeptics and critical thinkers, we must move beyond our emotional responses because by understanding how others have gone wrong and how science is subject to social control and cultural influences, we can improve our understanding of how the world works. It is for this reason that it is so important for us to understand the history of both science and pseudoscience. If we see the larger picture of how these movements evolve and figure out how their thinking went wrong, we won't make the same mistakes. The seventeenth-century Dutch philosopher Baruch Spinoza said it best: "I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them."

  PART 2

  PSEUDOSCIENCE

  AND

  SUPERSTITION

  Rule 1

  We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances.

  To this purpose the philosophers say that Nature does nothing in vain, and more is in vain when less will serve; for Nature is pleased with simplicity, and affects not the pomp of superfluous causes.

  —Isaac Newton, "Rules of Reasoning in Philosophy," Principia Mathematica, 1687

 
