Believing

by Michael McGuire


  The list is revealing with regard to both the beliefs of past practitioners and their creativity. Only phrenology—the study of the brain as a composite of various organs that localize social, moral, and intellectual qualities—has stood the test of time, and only in a revised and updated form that has found a place in today’s neuroscience. But this is not to say that the many forms of pseudoscience were, or are, simply poppycock.

  Certainly charlatans and the mentally deranged have earned their place in the annals of pseudoscience. Still, a close look at the research and reasoning of the practitioners is informative. It reveals their often-serious attempts to verify their beliefs and speculations using era-prevailing standards for experimentation and the interpretation of evidence. Evidence was respected and influential in leading to new methods and explanations of their findings. That is, they tried to reduce divides.41

  It is now clear that highly complex and sophisticated intellectual patterns guided much of the thinking that underpinned the fields of astrology, white magic, alchemy, even witchcraft.42 Much of it would be dismissed today, particularly that which incorporates beliefs infused with a sturdy dose of mysticism or possibly yet-to-be-identified laws of the universe.

  Still, much of the evidence associated with these beliefs is acceptable. Consider alchemy. Many of its views can be traced to events we observe daily, such as things changing under specific conditions.43 For example, water turns to ice when the temperature drops low enough. Burned wood produces heat, disappears, and becomes ash and smoke. Seeds grow into shrubs and trees and vegetables. Lizards change color. Substances such as clay or iron ore change their nature when heated. And dead loved ones communicate in dreams.

  As the historian Steven Shapin has argued convincingly: “Science is a cultural activity that is an integral element of the societies in which it is practiced, and whose basic mores and conventions it shares.”44 Pseudoscience, at least some of it, closely fits Shapin’s description.

  In passing, it is worth noting some of the individuals who embraced these sciences: Albertus Magnus (the only philosopher to be called “The Great” and a mentor of Thomas Aquinas), Thomas Aquinas, Arnaldus de Villanova, Pope John XXII, Cornelius Agrippa, Paracelsus, Alfred Russel Wallace, and Isaac Newton. Is there a more impressive roster of intellects?

  While pseudoscience was most influential during past centuries—mainly those bracketed by the Renaissance—its attraction remains. Today relatively few people believe in teleportation or alien abduction. Yet it’s a good bet that a third of the world’s pet owners (including the author) subscribe to some form of animal psi, perhaps a tenth to stigmata and exorcism, and there are today vast and profitable industries to satisfy those committed to managing their lives with astrology and tarot cards.

  An interview with a thirty-six-year-old female teacher-artist, a university graduate who holds a master’s degree in art and was brought up in a family with no religious affiliation, provides insight into why some pseudosciences persist:

  Author: “You say you believe in astrology. Fill me in if you would.”

  Interviewee: “I believe the sun, moon, and rising signs tell me things about others’ personalities.”

  Author: “For example?”

  Interviewee: “I don’t trust Gemini. They’re two-faced. They can be nice one moment and treacherous the next. And I don’t like Scorpios. They are secretive and suspicious.”

  Author: “You know people with these signs?”

  Interviewee: “Yes.”

  Author: “You’re confident their signs describe their personalities?”

  Interviewee: “They do, nearly always.”

  Author: “What would you do if you met a person whom you liked who was not two-faced, and then you found out he or she was a Gemini?”

  Interviewee: “I’d be very cautious, even if I liked the person. I would wait to see if the second face appeared.”

  Author: “How do you explain the relationship between astrological signs and personality?”

  Interviewee: “There is order in the universe that we can’t sense or know. But it can be realized in its signs.”

  Author: “I don’t understand. Could you elaborate?”

  Interviewee: “It’s [the order in the universe] analogous to gravity. You can’t see or touch gravity, yet it affects everything we do. I mean, you can see the results of gravity—stones always fall down, not up. The order works the same way. You can see the results in personality.”

  Author: “Where did your ideas come from?”

  Interviewee: “I thought you might ask that. In high school, I went to an astrologist. She told me that she could tell me the names of my friends if I could provide their signs. I did. And she correctly named three of my friends. It was amazing, truly amazing. She knew nothing about me before I visited her. I’ve believed in astrology ever since.”

  If personal experience is evidence—and it is—the preceding interview illustrates why, for many people, the belief-evidence divide is far narrower than might be predicted and why it preempts alternative views and evidence. It’s easy to reason with direct evidence, just as it is hard to reason against it or without it.

  Critical points emerge from the history of belief. The most critical is this: whatever their source, the brain is always involved, which is to say that beliefs—all beliefs—are context bound. They are products not of a logical or rational system but of systems that put things together in ways that reflect the brain’s evolved internal structure and default operations.45 These evolved structures appear to dictate that believing something—at times, literally anything—is a fundamental default feature of the brain and that narrowing divides facilitates belief acceptance and longevity.

  Evidence is the bedrock of the justice system and of scientific research. We also often imagine that it is the foundation of our own beliefs. My investigations into psychology and history yielded few satisfying answers, and I was still in a morass of contradictions when I flew to California to visit my father. Prior to his retirement, he was the president of an insurance company, and during the summers as a teenager, I worked for his company under the supervision of a claims adjustor named Joe. The experience was an eye-opener.

  Joe had gone to work as a stock boy when he was sixteen. Two years later, he was transferred to the claims department. By the time we met, he had been there thirty years and was the company’s chief claims adjustor with a reputation in the industry as a person with uncanny powers for identifying false claims. That summer we traveled around Los Angeles daily to interview claimants and assess their losses. For Joe, claims work was about evidence and how it ties to events.

  Joe liked to eat, so we had a lunch routine. It was Mexican food on Monday, Chinese on Tuesday, German on Wednesday, hamburgers on Thursday, and on Friday, I got my choice. At lunch, we would review our findings from the previous afternoon and that morning, make decisions about the claims, and then hit the road for more interviews. His thoroughness was amazing. Literally no detail was overlooked—details that eluded me. His motto was: “If you know how the world works, spotting false claims is a cinch.”

  Dad and I reminisced about those days.

  “I loved working with Joe. It was an education. And he always claimed I was doing a fine job, no doubt because you were president. What’s happened to him?”

  Dad pushed back his gray hair and smiled. “It’s a classic Joe story about evidence and interpretation. When he retired, he bought a large piece of land in the foothills of the Sierra Nevada to devote his time to animal conservation. After a couple of years, he decided to increase the bird population. First, he counted the number of birds. Then he planted sunflowers, corn, and grapes, things the birds living in the area like to eat. A year later, he counted the birds again and the number had increased significantly. He was delighted.”

  We paused to make some coffee.

  Dad continued. “It was about this time I visited him. With great enthusiasm he told me of his success, how he had established a clear connection between increasing the food supply and the number of birds. It seemed straightforward. I was convinced.

  “But during my visit, Joe’s friend Dave—I don’t think you ever met him—dropped by. Dave works for the state’s agriculture department and is an expert on birds. Joe told him of his success. Dave lauded his efforts but was skeptical. Joe wanted to know why.

  “Dave was reluctant to respond. But Joe insisted. Eventually Dave provided a list of other possible explanations. I don’t recall his exact words, but it went something like this: Migration patterns might have changed. Alternative feeding sites might have been compromised. Predator reduction might have allowed more than the usual number of young birds to survive. Events on the property such as a reduction in the use of machines or pesticides might have made the property more inviting.”

  “And how did Joe deal with that?” I asked.

  “He wasn’t happy.”

  “And then?”

  “You know Joe. He never gives up easily. The next year, he counted the birds again, planted even more vegetables, checked migration patterns and predator prevalence, and increased the use of machines. More birds appeared compared to the year before.”

  “And?”

  “I talked with him last week. He mentioned that Dave was now partially satisfied with his explanation.”

  Academic careers, like justice, often rise or fall on evidence. Historians are supposed to get their facts right. Chemists have to identify the elements with which they are dealing. And so forth. But academics aren’t alone when it comes to the importance of evidence and its interpretation. Everyone is involved, from cooks to taxi drivers to the police, and, of course, courts and juries spend endless hours identifying, classifying, and sifting evidence.

  Except for the most obvious explanations, such as sticking a finger through the hole in the roof through which water is dripping into the kitchen, there are usually alternative ways of explaining what we take as evidence. And, if we are honest with ourselves, we rarely consider these explanations once we have a belief to which we’ve tied the evidence, whatever its quality.

  The brain uses transparent systems for organizing and interpreting evidence. Well-known examples of the workings of these systems include intuition, inference, and various types of logic. The matter doesn’t rest there, however. There are other systems, many of which are idiosyncratic and often inconsistent.

  It was time for me to address evidence. Since my days with Joe, I had been living with it. Yet I hadn’t asked certain critical questions. What is it? Are there different types? If there are, how are they best described? And, of course, how does evidence tie to beliefs? None of these questions would lead to quick or crystal-clear answers.

  POSTMODERNISTS

  As postmodernists view it, life is largely “narrative.” Even if we can’t find truth or meaning in any “objective reality,” we can still create meaning by constructing our own narratives and telling each other stories. Explanations of events are never exact. Theories are nothing more than speculations. As to evidence—something that supposedly furnishes proof—a fundamental mistake among scientists and philosophers over the last several centuries has been their conviction that there is such a thing as objective truth.1 Despite scientific claims to the contrary, wide and often-ignored divides separate their explanations from evidence.

  Such postmodernist trademarks are critiques primarily of modern science and its supposed progress since the Enlightenment. On this view, the unstated givens and assumptions that scientists have used to validate their findings since the late eighteenth century are invalid, particularly in the social sciences. Scientific thinking and reasoning are forms of self-deception based on the assumption that there is a clear and identifiable distinction between the self and physical reality, which postmodernists insist there isn’t. Facts and concepts don’t exist separately from the processes of thinking and speaking about them. They are subject to the influences the brain exerts on information. They’re just stories about ourselves, our beliefs, and what people take for reality. That’s as good as it gets.

  It’s no secret that scientists, philosophers, and most everyone else engage in discussions about how best to describe and interpret evidence and identify the divides that separate evidence from belief and explanation. It is also no secret that no scientific fact or explanation is immune from reevaluation and reinterpretation. The Newtonian constant of gravitation, whose accepted value has remained largely unchanged for centuries, is now being revisited in light of findings suggesting alternative values. Gravity may soon be reformulated as a special form of entropy and information storage.2 Recently, not only has the decades-old explanation of chemical bonds come into question, but a change to the periodic table that would assign ranges of atomic weights to certain elements is also on the agenda.3

  Further, if pushed, most scientists would agree with Kant’s distinction between the appearance of things as filtered through our senses and things-in-themselves (“objective reality”), which are unknowable. In effect, there are limits to what scientists can know due to the way the brain processes information. This is a pivotal assertion on which the postmodernist thesis tries to balance. Its primary implication, however, is not what it might first seem: namely, that all scientific findings are invalid if things-in-themselves are taken as benchmarks. To argue this way is tantamount to asserting that nothing can be known about “objective reality” because we don’t know it completely and accurately. The critical point is this: science has developed and applies a methodology that systematically reduces the probability of connecting explanations with irrelevant evidence. The method narrows divides when experimental findings are repeatedly confirmed. It widens divides when they are not. Over time, things-in-themselves can slowly change from unknowables to partial knowables. The progressive specification of the human genetic system can be characterized this way.4 True, there are limits to what scientists can know. Yet these limits don’t invalidate scientific methodology, its findings and explanations, and the predictions its explanations make possible.5

  While provocative, the views of postmodernists are primarily those of intellectuals and artists who believe that much of the thought and evidence that has preceded them is invalid and improvident. Their ideas are only tangential to the themes discussed here, however. Why? Because they have failed to significantly inform or alter the goings-on of daily life. That rocks fall down and not up, that certain conditions are required for seed germination, and that a fire warms a cold room have been unchanged examples of such belief-evidence relationships for centuries. Claiming that water as we normally experience and understand it is not a thing-in-itself is an interesting metaphysical assertion, but from the perspective of daily life, it is little more than that. Moreover, many explanations such as why water freezes, why wood combusts and creates heat, and why seeds germinate not only accurately predict events but also have withstood the test of time. Chance or random events can hardly explain this history, a point postmodernists sidestep.

  Still, the postmodernists shouldn’t be dismissed out of hand. Their claim that the brain moves relentlessly toward narrative and storytelling seems accurate. There is also a sturdy ring of truth to their view of scientists who assume the mantle of “experts” when consulting with institutions and government agencies. Such experts frequently conflate personal convictions, questionable evidence, untested beliefs, and elements of cultural myths in their efforts to explain and justify social policy and design social interventions. If evidence or beliefs are questionable, the decisions and predictions to which they lead will be inexact. Much the same may be said about the contributions of scientists to TV programs such as Nature, NOVA, and those on the History Channel and the National Geographic Channel. A few pieces of often-questionable evidence serve as departure points for an elaborate rendering of a this-may-be-true story, with the unstated implication that what is presented may accurately depict objective reality.

  DIRECT EVIDENCE

  Evidence is information that can be used to justify or nullify a belief. It is not a new topic. Most likely every human being, past and present, has had a say about it and, at times, has struggled with what it is or might be. It is discussed as often as the weather, and it can be as slippery as an icy pavement.

  Multiple types of evidence exist. Three are of interest here: direct, indirect, and circumstantial.

  As the term is used here, direct evidence is the evidence of personal experience. It ranges from simple observations—a faucet is leaking—to reviewing the findings of a complex experiment, to highly stressful situations involving the full range of emotional and cognitive responses experienced when, for example, one is a passenger on an airplane in mechanical distress.

  Used this way, the definition is consistent with what is observed among young children or adults who move to a novel culture or ecological setting. Largely through trial-and-error experiences and the gradual accumulation of direct evidence, they piece together an understanding of their environment, its inhabitants, and customary and expected behavior. From the early moments of life to its final hours, the importance of direct evidence as a guide to living remains high.

  Yet much of what the brain processes while experiencing an event may go unrecognized and remain hidden from awareness for years. The following incident captures this point.

  Years ago, I was a member of a group searching for a lost Mayan city in the jungle of northern Guatemala. There was much discussion among group members about possible dangers posed by venomous snakes, army ants, and jaguars.

  One evening, while bathing in a stream near the camp, I sensed that a jaguar was perched on the ledge above me. I turned, and there it was. “I don’t have my underpants on” was my first thought. It passed quickly. The next thought went something like this: “Jaguars are a type of cat . . . cats don’t like water . . . dive into the stream and swim to camp.” This I did, and at that moment, there was no divide between my thoughts and the action that followed. Back in camp, I related the event to members of the group. As I told them about the jaguar, I had no sense of fear or anxiety. Nor did fear or anxiety surface during the several years that followed, whenever I recalled the event for others.

 
