Great Illusion

by Paul Singh


  The brain makes assumptions about the size, distance, and relative position of objects. These assumptions are generally correct under normal circumstances, but if one or more of them is wrong, the brain constructs a false picture of what is actually there. How is it that an airplane in the sky can appear to stand completely still and look like a star? A star and a plane are very different from one another: they are different sizes and they are at vastly different distances from us. This tells us that our perceptions are not always reliable.

  Optical illusions are particularly important to understand because the brain has evolved to devote much of its capacity to the visual system, which is the primary way we learn about external reality. In 1997, a highly trained military helicopter pilot mistook an ordinary Mylar balloon drifting by for a fast-moving aircraft and took evasive action. Again, this shows that if one of the assumptions our brain relies on is incorrect, we, experts included, can be totally mistaken about what we think we see.

  Here is something one can try at home. Draw a picture of a cube on a piece of paper (a two-dimensional picture of a three-dimensional object). Either face of the cube can be perceived as pointing toward you. Try walking back and forth while looking at it from some distance.

  Believing is Seeing

  Visual and optical illusions become much more interesting when we start seeing what is not there. It is often said that seeing is believing. It turns out that we often see what we already believe is there, not what is actually in front of us. Perhaps the greatest examples of this are religious experiences. People who have religious experiences always see in their visions what they have been taught all their lives. For example, a Muslim will never see a vision of Jesus in his religious experience, and a Christian will never see Mohammad.

  Magicians take full advantage of the limitations of human perception in performing their parlor tricks. If we often see what we believe, we then have to ask ourselves: can we trust anything we see? Even bizarre experiences are possible with a brain that sees illusions. For example, in January of 2008 in Stephenville, Texas, many witnesses saw a UFO that was a mile long. It turned out that the observers were connecting dots of light in their heads, lights that were actually flares dropped by an F-16. That is what the brain does; it fills in the gaps and connects the dots based on the prior pattern recognition that resides in our neuronal networks, thus constructing perceptions about the world that are not always true.

  In the famous raid over Los Angeles on February 25, 1942, soldiers manning artillery fired hundreds of anti-aircraft shells at planes believed to be invading Los Angeles. Later investigation showed that there had been no raid over Los Angeles at all. The well-known hoax of 1909 is a classic example in which tens of thousands of people, including government officials, judges, law enforcement, and military personnel, saw a large plane in the sky. Even the best “aircraft expert” of the time, Alex Rivera, “saw it clearly” and described the large plane’s “exact dimensions.” It turned out to be a hoax perpetrated by Wallace Tillinghast, an inventor from Massachusetts, using some sort of weather balloon. The Wright brothers were still tinkering with their flyer at Kitty Hawk in 1908, and big planes such as the one described did not yet exist; only a great deal of excitement and optimism about the new technology did. This is a prime example of how one person’s delusion can act like a contagion and spread like wildfire through the rest of the population.

  There are many flaws in the way our brains construct what we think we see or perceive. Input from one sense can actually influence the other senses inside our brains. Our brains compare different types of sensory input in order to construct one seamless picture, and they adjust one sense to make it match another. In a classic experiment, researchers were able to fool wine-tasting experts into believing that they were tasting red wine simply by adding food coloring to white wine to make it look red. It was even more surprising when the same wine was put into bottles with very expensive brand labels and with cheap brand labels: the top wine-tasting experts praised the expensively labeled wine and did not have nice things to say about the wine with the cheap labels.

  Such research and other similar findings have profound implications for how our pre-existing expectations and beliefs fool us into believing what is not true. They further show that our interpretations of our experiences can change simply because of what we have grown up believing to be true, whether or not those beliefs actually are true. If these top experts cannot be trusted about what they know, how can one trust the visions of religious prophets? Religious philosophers often engage in special pleading, saying that this is not a good comparison because religious prophets receive direct revelation from God. Special pleading allows them to keep their claims permanently outside the realm of scientific scrutiny.

  Carl Sagan’s tale of “A fire-breathing dragon lives in my garage” is a perfect example of how those who make scientifically unfalsifiable claims shift the burden of proof to others by special pleading at every step of the way. Someone tells you there is a dragon in his garage. So you take a look in the garage, and you don’t see any dragon. He tells you the dragon is invisible. So you decide to spread some flour on the garage floor to detect the dragon’s footprints, but there are none. And so it goes. Your friend will always have an excuse for why there is no evidence of a dragon in his garage. You tell your friend that there is no evidence of a dragon in his garage. But he tells you that you can’t prove there isn’t one. And he is right: you can’t prove there isn’t a dragon in the garage. But that is precisely the point of Sagan’s tale. The burden of proof is not on you to prove anything. The burden of proof is on the person who makes the claim about the dragon. It is his responsibility to provide evidence for the dragon, and he can’t provide any because there isn’t any. And you can’t prove his claim wrong because it is unfalsifiable, and it is unfalsifiable because it makes no predictions about what you will perceive when you look in the garage. Such unfalsifiable claims are nonsense.

  This is how all superstition, pseudo-science, and religion work. Believers in such things make extraordinary claims that cannot be falsified. And when they are forced to admit that they don’t have any evidence to support their claims, they respond by saying that you cannot prove them wrong. But, again, the burden of proof is on the person making the extraordinary claim about miracles or UFOs or whatever to provide evidence for it. The burden of proof is not on the other person to prove them wrong, something that is impossible to do given that the claim is unfalsifiable.

  The take-home lesson is that we should never believe a claim to be true simply because no one can prove it false. Theologians are experts at this kind of nonsense. Are delusional people making things up? Evidence shows that the human brain is universally delusional in many ways, and therefore people who promote superstitions are not particularly more delusional than the rest of us; religious delusions are simply classic examples of how the brain creates illusions and delusions. Logic and scientific skepticism form a skill that can be used to overcome the limitations of our own brains. Like any other skill, such as learning to play the piano, it requires training, in this case training in metacognition as well as a basic education in the sciences.

  Psychological research has confirmed that we cannot trust many of our sensory perceptions. For example, when a person’s gender appears ambiguous or androgynous, what really helps us decide is hearing the person’s voice. Our brains also expect sensory inputs to be congruous and will force them into congruence even when they are not. An example is the McGurk effect, which refers to the fact that the consonants we hear are affected by the mouth movements we see: the brain adjusts the sound we hear to make it match the lip movements. Another example is temporal synchronization, the way the brain synchronizes sights and sounds.

  If a sound is made by clapping hands less than 90 feet from us, our brain will synchronize the two and make it appear that the sound of the clap and the sight of the clap were simultaneous events. The brain does this despite the fact that the input from these two senses, sight and hearing, arrives at two different times. The brain takes a different amount of time to process each input, and yet it synchronizes them successfully to give us the illusion of simultaneity. The brain is able to pull this off as long as the auditory lag is less than about 80 milliseconds, the time sound takes to travel roughly 90 feet. Once we exceed that distance, the clap is heard after the hands are seen clapping, and the split between the two sensory inputs becomes more and more dramatic as the distance increases.
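The arithmetic linking the 80-millisecond window to the 90-foot figure can be sketched in a few lines of Python. This is purely illustrative: the ~343 m/s speed of sound and the function names are assumptions introduced here, not taken from the book.

```python
# Sketch: at what distance does the sound of a clap lag the sight of it
# by more than the brain's ~80 ms audio-visual fusion window?
# Assumes speed of sound ~343 m/s (dry air, ~20 C); light's travel time
# is negligible over these distances.

SPEED_OF_SOUND_M_PER_S = 343.0
FUSION_WINDOW_S = 0.080       # ~80 ms, per the text
METERS_PER_FOOT = 0.3048

def audio_lag_seconds(distance_feet: float) -> float:
    """Time the sound needs to cover the given distance."""
    return distance_feet * METERS_PER_FOOT / SPEED_OF_SOUND_M_PER_S

def perceived_as_simultaneous(distance_feet: float) -> bool:
    """True if the audio lag still falls inside the fusion window."""
    return audio_lag_seconds(distance_feet) <= FUSION_WINDOW_S

# Distance at which the lag exactly equals the fusion window:
max_distance_feet = FUSION_WINDOW_S * SPEED_OF_SOUND_M_PER_S / METERS_PER_FOOT
print(f"Fusion breaks down beyond ~{max_distance_feet:.0f} feet")
```

Under these assumed constants the break-even distance works out to about 90 feet, matching the figure in the text; a clap at 120 feet lags by over 100 ms and is heard noticeably after it is seen.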

  There are neurological conditions in which it is worse. For example, in a condition called synesthesia, people can in effect use their ears to see or their eyes to hear, and they can taste colors and smell sounds or even numbers. A Turkish patient, an artist who was born blind, can draw a building in perspective if you allow him to touch its entire three-dimensional periphery. We all have synesthesia to some extent, but it manifests very dramatically when it becomes pathological. It is possible that networks dealing with different sensory inputs bleed into each other. This could be due to anatomical variations in the brain’s hard-wiring and neuronal connections, just as such variations occur in the rest of the body’s anatomical structures, the routes of blood vessels, and nerves. The study of synesthesia is a work in progress in neurology today, not yet fully delineated in all respects.

  Attention

  Attention is extremely important in the act of perception. We are bombarded with sensory information every moment of the day. Research shows that our brains can attend to only a tiny fraction of what we see or perceive at any given moment. It turns out that we filter out most of the information that arrives through the senses and focus only on what the brain considers very important or relevant. For example, when subjects are shown a video and asked to focus on one type of activity in the film, researchers note that they miss dramatic events happening in plain sight, such as a gorilla slowly walking through a small group of people passing a ball among themselves. We focus on a very small visual field in front of us and ignore all that is in the periphery or in the background. The brain actively suppresses peripheral vision so that we can focus on what is important to us, which allows the attended area to become enhanced. This is the brain’s way of preventing us from becoming distracted so that it can accomplish its intended goal. The phenomenon is called inattentional blindness: we are essentially blind to whatever we are not attending to at a given moment in time.

  The whole process of paying attention is unconscious; our brain does it all, even though we, the “ghost in the machine” we think we are, get all the credit. An extreme example of inattentional blindness is change blindness. After reading about this phenomenon, I personally arranged an experiment with a group of researchers at a crowded party. We set up a fifty-year-old man named Joe (not too old, not too young) in a situation where he was talking to one of our actors, a stranger to him. After a few minutes, we distracted Joe for a second and switched the person he was talking to with an entirely different person of about the same height and age. When Joe turned back to his conversation partner, he continued talking without noticing that he was now talking to an entirely different person.

  We cannot attend to more than one thing at a time. In fact, whenever psychologists put people to the test, multitasking turns out to be a myth; the brain has not evolved for it. Only two percent of subjects who used a cell phone while driving, for example, could do so without degrading their driving performance. Laws now exist in several states prohibiting drivers from using cell phones. Those who consider themselves good multitaskers were actually found to be worse multitaskers than the rest of us; they were confusing their organizational skill of switching quickly from job to job with genuine multitasking. Because of this and many other foibles of the brain, eyewitness testimony is often very unreliable. Eyewitnesses are subject to suggestion, they have false confidence in the reliability of their memories, and they confabulate to fill in the gaps in the story. This is because the brain actively fabricates a consistent reality out of what it thinks it perceives, filling in the missing pieces subconsciously. We want our story to be consistent.

  A classic example of this is the story of Jean Hill, one of the very first witnesses in John F. Kennedy’s assassination case. Although she said she saw nothing or very little in her first interview right after the assassination, she changed her story over the years as she incorporated into her narrative everything she heard on TV from the testimonies of others. That is not something she did deliberately; it is what her brain did, and that is what the brain does. None of us, including the author of this book, is an exception. That is why scientific inquiry is necessary to separate fact from fiction: scientific methods are the only methods that work in separating reality from imagination.

  Perceptions are actively constructed in the brain. Sensory inputs are interpreted and then modified by the brain to fit the narrative. Different streams from different senses are combined, compared, altered, and confabulated to weave a complete story that is largely fictional. Sensory inputs are assembled into meaningful patterns, patterns that make sense to us and fit the internal model of the world we have already constructed over time. The brain also shapes how we put our perceptions together into our own unique experiences, experiences that are consistent with our group dynamics and that do not violate our deeply held beliefs.

  Deceptive Memories

  The moment we have a perception, it becomes a memory. If the perception is flawed, it will give rise to a flawed memory. Thus we get farther and farther from the truth as skewed or false sensory inputs turn into false perceptions, which turn into false memories. These false memories are then reconstructed every time we recall them. Sometimes we actively manufacture or invent memories, which gives rise to permanent cognitive biases. These formulated brain patterns lead to false experiences and sometimes even to illusions and delusions. The most important fact to remember is that this snowballing effect starts with the sensory inputs. Memories can even be fused, fabricated, or contrived by the brain. Memory is not a passive recording of events, as one might expect, but is actively constructed and filtered through the beliefs we have acquired over a lifetime. Without external, objective, scientific verification, therefore, we simply can’t be sure how accurate or flawed our long-term memories are.

  Even flash-bulb memories have been found to be as flawed as our other memories. Flash-bulb memories tend to be vivid and long lasting, reinforced by the emotion of the event; memories of traumatic events are all flash-bulb memories by definition because of their emotional impact. Yet research on subjects’ flash-bulb memories of 9/11 and of the Challenger disaster of January 28, 1986, has shown that both kinds of memories fade at the same rate and that flash-bulb memories are as unreliable as memories of ordinary day-to-day events; the only difference is that people preserve their confidence in the accuracy of their flash-bulb memories. Even though people are very confident in the truthfulness of these memories, research has shown that confidence is not a good predictor of accuracy.

  Truth amnesia involves remembering a claim (whether true or false) much more easily than remembering the facts on which that memory is based. Ian Skurnik and his colleagues conducted a study in 2007 showing that 27 percent of young adults and four percent of older adults misremembered a false statement as true three days after they were told it. They remembered that they had heard the statement before, but they did not remember whether the statement was true or false.

 
  There is also a distinction between thematic memory and memory of details. The content of the memory that carries the emotional theme of the event lasts, but the details fade much faster. Over time, the details of the memory are altered by the brain, and we construct a new narrative with the emotions and themes still attached to it. The memory becomes contaminated because the details introduced into the account are details we were exposed to after the event itself. Again, the classic example is the story of Jean Hill, who witnessed Kennedy’s assassination. Over the years her testimony acquired more and more details mixed in from everything she heard on television, over months and years, from the testimonies of other witnesses. Witnesses tend to contaminate each other’s accounts by bringing them into line with one another.

  Memories can also become contaminated by someone asking leading questions by way of suggestion. For example, psychologists asked subjects what the woman was wearing in a movie they had seen three days earlier, knowing that there was no woman in that movie. The subjects took the suggestion, invented the belief that they had seen a woman in the movie, and began describing the different ways she was dressed. These phenomena of suggestion and truth amnesia have profound implications for the religious and cultural beliefs that people adhere to so strongly, especially when those beliefs are tied to emotional events in their lives, and even more so when they are related to in-group beliefs.

 
