The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us
Do people actually prefer expressions of knowledge that exude more certainty to more tentative statements, even when the tentative ones are better calibrated? Try answering the following simple question devised by the Dutch psychologist Gideon Keren:
Listed below are four-day weather forecasts for the probability of rain, made by two meteorologists, Anna and Betty:
As it turned out, it rained on three out of the four days. Who, in your opinion, was a better forecaster: Anna or Betty?
This question pits our preferences for accuracy and certainty against each other. Betty said it should rain 75 percent of the time, and it did, so her predictions reflected no illusion of knowledge. Anna thought she knew more about the likelihood of rain than she really did: It would have to have rained on all four days for her forecasts to be more accurate than Betty’s. Yet when we conducted an experiment using a variant of this question, nearly half of our subjects preferred Anna’s forecasts.42
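To see why Betty’s record is better by the numbers, forecasts like these can be scored with the Brier score: the mean squared difference between each stated probability and what actually happened, where lower is better. Below is a minimal sketch in Python. The Brier score is our choice of metric, not one the text names, and Anna’s 90 percent figure is an assumption based on Keren’s original question; only Betty’s 75 percent comes from the passage above.

```python
# Score two weather forecasters with the Brier score (lower is better).
# Assumption: Anna forecast 90% rain each day (from Keren's original
# question, not stated in the passage); Betty's 75% is from the text.

def brier(forecasts, outcomes):
    """Mean squared difference between probabilities and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

anna = [0.90] * 4   # assumed: 90% chance of rain on each of the four days
betty = [0.75] * 4  # 75% chance of rain on each of the four days

rained_three_of_four = [1, 1, 1, 0]  # what actually happened
print(brier(anna, rained_three_of_four))   # 0.21
print(brier(betty, rained_three_of_four))  # 0.1875 -- Betty wins

rained_all_four = [1, 1, 1, 1]  # the only outcome that favors Anna
print(brier(anna, rained_all_four))        # 0.01
print(brier(betty, rained_all_four))       # 0.0625
```

Under this scoring, Betty comes out ahead for anything short of four rainy days; Anna’s extra confidence pays off only if it rains every single day.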
The conditions of this experiment differ from most real-world situations, in which we rarely get to choose among experts with such clear track records of success or failure in prediction. A study of experts on international politics—a field in which it can take years or decades to see whether predictions are borne out—found that their forecasts were significantly less accurate than those of simple statistical models. The way the forecasts were worse was revealing: In general, the experts predicted that political and economic conditions would change (for the better or the worse) more often than they actually did. So a strategy of simply assuming that the future will be the same as the present would have yielded more accurate predictions (but probably less airtime for the pundit). Unlike the weather forecasting experiment, though, people listening to these political experts have no way to tell in advance how accurate their forecasts will be.43 Compared to the laboratory, in the real world it’s much harder to make a correct choice, precisely because we either lack the necessary information, or we have it but lack the time, attention, and insight we need to evaluate it properly.
The Anna/Betty experiment shows that even when we have all the necessary information to recognize which expert knows the limits of her own knowledge, we often prefer the one who does not. Self-help authors who say precisely what to do (“eat this, not that”) have larger audiences than those who give a menu of reasonable options for readers to try out in order to find out what works best for them. TV stock-picking guru Jim Cramer tells you to “buy buy buy” or “sell sell sell” (with a hearty “Boo-yah!”) rather than to analyze investment ideas in the context of your overall financial goals, weighting of different types of assets, and other nuanced considerations that might undermine the dazzling sense of conviction that he exudes.44
So the illusion of knowledge persists in part because people prefer experts who think they know more than they really do. People who know the limits of their knowledge say things like “there is a 75 percent chance of rain,” while people who don’t know those limits express undue certainty. Yet even those with the best understanding of their field can fall prey to the illusion of knowledge. Recall the scientists who made misguided predictions about the number of human genes, the limits of natural resources, and the promise of chess-playing computers. These scientists were far from marginal figures or failures in their fields. Eric Lander, who mispredicted the number of human genes, and John Holdren, who wrongly forecast ever-rising commodity prices, went on to become science advisers in Barack Obama’s administration. Paul Ehrlich received a MacArthur Foundation “genius” award worth $345,000 in 1990, the same year he lost his bet about commodity prices. And Herbert Simon won the Nobel Prize in economics in 1978—for his “pioneering research on the decision-making processes within economic organizations,” not for his ability to forecast the results of chess matches.45
In none of these cases did the illusion of knowledge cost people their livelihoods, but in others it has. The archetype of the successful investor is not someone who hedges his bets carefully and makes sure that his asset allocation and leverage reflect an appropriate level of uncertainty about the future. It is one who makes bold moves—who gambles it all and wins. The illusion of knowledge is so strong that we eagerly welcome back into the fold people who win for a while and then go too far and lose it all. In 2007, despite his disastrous losses at Amaranth and Deutsche Bank, and despite having been formally charged with market manipulation by the U.S. government, Brian Hunter was raising capital for a new hedge fund—as did the disgraced founders of Long-Term Capital Management and other failed funds before him.46
Jumping to Conclusions
ON MAY 29, 2005, a six-year-old girl was hospitalized in Cincinnati, where she’d been visiting relatives. She was dehydrated, had a fever and a rash, and had to spend days in the hospital on a ventilator. The hospital sent a blood sample to the Ohio State Department of Health Laboratory for testing, and the result confirmed their initial diagnosis: She had measles.1
Measles is among the most infectious viruses affecting children. When a person with measles sneezes, another person can contract the disease just by breathing the air in the room or touching a contaminated surface—the virus remains active for up to two hours. The rash is the first visible evidence that distinguishes the measles infection from other viruses, but the disease is contagious for four days before the rash appears. Moreover, someone exposed to measles might show no symptoms at all for up to two weeks.
The combination of delayed onset of symptoms, the potential for carriers to spread the disease before they know they are infected, and the highly infectious nature of the virus itself creates a perfect recipe for epidemics. Before the 1970s, measles was so prevalent, even in the United States, that it was unusual for children not to get it. It’s still prevalent throughout much of the world; according to the World Health Organization (WHO), nearly two hundred thousand people died from measles infection in 2007 alone, and it remains a leading cause of death in children worldwide.
Serious complications of the disease include blindness, severe dehydration, diarrhea, encephalitis, and pneumonia. In poorer, developing countries with inadequate health care and high rates of malnutrition, measles outbreaks can be catastrophic; the WHO estimates death rates as high as 10 percent from outbreaks in such regions. In wealthier countries with effective health care systems, measles rarely causes death, but it can cause serious complications for people with existing health problems like asthma.
The elimination of measles is one of the great success stories for programs of systematic vaccination. Cases of measles in the United States are exceptionally rare today because of the effectiveness of the combination MMR vaccine that inoculates against measles, mumps, and rubella. Mandatory MMR vaccination of children before they enter the public school system largely eliminated measles from the United States by the year 2000. Vaccination levels of 90 percent of the population are needed to effectively prevent epidemics, and the United States has exceeded that threshold for more than a decade. So how did a six-year-old girl in Cincinnati get the disease?
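Before returning to that question, it is worth noting where thresholds like 90 percent come from. A standard epidemiological approximation, not spelled out in the text, puts the herd-immunity threshold at 1 − 1/R0, where R0 is the number of people a single infected person would infect in a fully susceptible population; the R0 range of 12 to 18 used below is a commonly cited estimate for measles, not a figure from the book.

```python
# Herd-immunity threshold, approximated as 1 - 1/R0. The formula and
# the R0 range of 12-18 for measles are standard epidemiology, not
# figures drawn from the text.

for r0 in (12, 18):
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: roughly {threshold:.0%} of the population must be immune")

# R0 = 12: roughly 92% of the population must be immune
# R0 = 18: roughly 94% of the population must be immune
```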
Measles is still endemic in parts of Europe where vaccination programs are voluntary, and full-scale epidemics are common in Africa and parts of Asia. Most cases of measles in the United States are isolated—an unvaccinated person visits a country where an outbreak is underway, is exposed to the virus, returns home, and then starts to show symptoms. The girl visiting Cincinnati lived in northwest Indiana and hadn’t been out of the country. So how did she get it?
Because measles can be contagious for so long before symptoms appear, it can be transmitted by people who don’t know they have it. Even if this girl hadn’t been to a region where measles is endemic, she could have unknowingly encountered someone who had. She most likely was infected about two weeks earlier, on May 15, when she attended a large gathering with about five hundred members of her Indiana church. Her parents reported to Cincinnati hospital workers that one of the teenagers at the gathering was sick—she had a fever, a cough, and conjunctivitis (colloquially known as “pink eye”). As it turned out, that seventeen-year-old girl had just returned to Indiana following a church mission in Bucharest, the capital of Romania, where she’d worked in an orphanage and hospital. She had traveled on commercial flights to get back to the United States on May 14 and attended the church gathering the next day. She was the “index case”—the first person to be infected, and thus the source of the infections in all of the later patients—in what quickly became the biggest measles outbreak in the United States since 2000.
During May and June of 2005, another 32 people contracted measles. Of these 34 documented cases, 33 were church members who either came into direct contact with the seventeen-year-old index case or lived in the same house as someone who had. The only person who contracted measles outside of the church community worked at a hospital where one of the patients was treated. Fortunately, none of those infected died from the disease. In addition to the six-year-old girl in Cincinnati, a forty-five-year-old man needed intravenous fluids, and the hospital worker needed six days of ventilator support because of pneumonia and respiratory distress. Through effective treatment and management—anyone exposed to the virus who hadn’t yet shown symptoms had to be quarantined for eighteen days—the outbreak was contained by the end of July, and no new cases were reported thereafter. By one estimate, the total cost of the containment and treatment efforts was nearly $300,000.2
Only two of the 34 patients had been vaccinated, and one of those two—the hospital worker—had received only one dose of the vaccine. The six-year-old girl hadn’t been vaccinated, nor had the seventeen-year-old who had traveled to Romania. Of the 500 people at the gathering, 50 were unvaccinated, and 16 of those 50 subsequently got measles. The outbreak was containable because most of the community members had been vaccinated. In countries where vaccination is less common, the same outbreak would have been much larger.
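The protective effect is easy to see with a back-of-the-envelope calculation. The 500, 50, and 16 figures come from the text; using two vaccinated cases as an upper bound for the gathering is our assumption, since only two of the 34 patients were vaccinated at all and one of them, the hospital worker, was never at the gathering.

```python
# Rough attack rates from the numbers reported in the text.
attendees = 500
unvaccinated = 50
unvaccinated_cases = 16                 # from the text
vaccinated = attendees - unvaccinated   # 450
vaccinated_cases_at_most = 2            # assumed upper bound (see note above)

print(f"unvaccinated attack rate: {unvaccinated_cases / unvaccinated:.0%}")
print(f"vaccinated attack rate: at most {vaccinated_cases_at_most / vaccinated:.1%}")

# unvaccinated attack rate: 32%
# vaccinated attack rate: at most 0.4%
```

Roughly a third of the unvaccinated attendees fell ill, against well under 1 percent of the vaccinated ones; that gap is what kept the outbreak confined to the church community.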
Why were 10 percent of the church members unvaccinated when the vaccination rate for school-age children in the United States is over 95 percent? Although vaccination is mandatory for all children attending public schools in the United States, in many states, parents can file a “personal belief exemption” that allows them to forgo vaccination for their children for religious or other reasons. And in fact, most of the measles cases occurred in a few families that had declined inoculation. Many of these families continued to refuse vaccination even as health authorities were trying to control the outbreak.
The 2005 Indiana outbreak was not unique. During the first seven months of 2008, the Centers for Disease Control and Prevention (CDC) documented 131 cases of measles in the United States, more than double the yearly average from 2001 through 2007, and the highest number since 1996. Most of the cases occurred among schoolchildren who were eligible for vaccination but whose parents had declined to have them vaccinated.
Why would parents knowingly reject a vaccine that could prevent a serious and highly contagious childhood illness, one that had been all but eliminated in the United States by that same vaccine? Why would people knowingly violate CDC and WHO guidelines by traveling to foreign countries where measles and other preventable diseases are prevalent without first vaccinating themselves? Why would parents expose their children to potentially deadly diseases like measles when a safe and effective vaccine has been available for more than forty years?
This behavior, as we’ll discover, is the result of another everyday illusion—the illusion of cause. Before we can understand why people would choose not to vaccinate their children, we must first consider three separate, but interrelated, biases that contribute to the illusion of cause. These biases arise from the fact that our minds are built to detect meaning in patterns, to infer causal relationships from coincidences, and to believe that earlier events cause later ones.
Seeing God in Everything
Pattern perception is central to our lives, and skill in many professions is based almost entirely on the ability to rapidly recognize a large variety of important patterns. Doctors look for combinations of symptoms that form a pattern, allowing them to infer an underlying cause, make a diagnosis, select a treatment, and predict their patient’s outcome. Clinical psychologists and counselors look for patterns in thoughts and behaviors to help diagnose mental dysfunction. Stock traders follow the ups and downs of the major indices, looking for consistencies that will give them an advantage. Baseball coaches decide where to position their players in the field based on regularities in where batters tend to hit the ball, and pitchers adjust their pitching based on the patterns they perceive in a batter’s swings. All of us use pattern detection without even knowing that we’re doing it. We can identify people we know from no more information than characteristic regularities in their gaits. Just by picking up patterns of movement and gesture from brief silent videos, students can even predict which teachers are likely to receive good ratings at the end of a semester.3 We can’t help but see patterns in the world and make predictions based on those patterns.
These extraordinary pattern detection abilities often serve us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations. Unfortunately, they can also lead us astray, contributing to the illusion of cause. At times, we perceive patterns where none exist, and we misperceive them where they do exist. Regardless of whether a repeating pattern actually exists, when we perceive that it does, we readily infer that it results from a causal relationship. Much as our memory for the world can be distorted to match our conceptions of what we should remember, and just as we can fail to see the gorillas around us because they do not fit with our preexisting expectations, our understanding of our world is systematically biased to perceive meaning rather than randomness and to infer cause rather than coincidence. And we are usually completely unaware of these biases.
The illusion of cause arises when we see patterns in randomness, and we are most likely to see patterns when we think we understand what is causing them. Our intuitive beliefs about causation lead us to perceive patterns consistent with those beliefs at least as often as the patterns we perceive lead us to form new beliefs. Some of the most striking examples of pattern perception gone awry involve the detection of faces in unusual places.
One day in 1994, Diana Duyser saw something strange after she bit into a grilled cheese sandwich she had just made. Etched into the surface of the toasted bread, staring back at her, was a face. Duyser, a jewelry designer in South Florida, immediately recognized the face as that of the Virgin Mary. She stopped eating the sandwich and stored it in a plastic box, where it remained, miraculously mold-free, for ten years. Then, for unknown reasons, she decided to sell this religious icon on eBay. The Internet gambling site GoldenPalace.com put in the winning bid of $28,000 and sent its CEO to personally pick up the purchase. In handing it over, Duyser was quoted as saying, “I do believe that this is the Virgin Mary Mother of God.”4
The human mind’s tendency to promiscuously perceive meaningful visual patterns in randomness has a one-word name: pareidolia. Like the Virgin Mary Grilled Cheese, many examples of pareidolia involve religious images. The “Nun Bun” was a cinnamon pastry whose twisty rolls eerily resembled the nose and jowls of Mother Teresa. It was found in a Nashville coffee shop in 1996, but was stolen on Christmas in 2005. “Our Lady of the Underpass” was another appearance by the Virgin Mary, this time in the guise of a salt stain under Interstate 94 in Chicago that drew huge crowds and stopped traffic for months in 2005. Other cases include Hot Chocolate Jesus, Jesus on a shrimp tail dinner, Jesus in a dental x-ray, and Cheesus (a Cheeto purportedly shaped like Jesus). Islam forbids images of Allah, but followers in West Yorkshire, England, have noticed the word “Allah” written out, in Arabic, in the veiny material inside a sliced-open tomato.
You won’t be surprised to learn that we favor a mundane explanation for all of these face sightings. Your visual system has a difficult problem to solve in recognizing faces, objects, and words. They can all appear in a wide variety of conditions: good light, bad light, near, far, oriented at different angles, with some parts hidden, in different colors, and so on. Like an amplifier that you turn up in order to hear a weak signal, your visual system is exquisitely sensitive to the patterns that are most important to you. In fact, visual areas of your brain can be activated by images that only vaguely resemble what they’re tuned for. In just one-fifth of a second, your brain can distinguish a face from other objects like chairs or cars. In just an instant more, your brain can distinguish objects that look a bit like faces, such as a parking meter or a three-prong outlet, from other objects like chairs. Seeing objects that resemble faces induces activity in a brain area called the fusiform gyrus that is highly sensitive to real faces. In other words, almost immediately after you see an object that looks anything like a face, your brain treats it like a face and processes it differently than other objects. That’s one reason why we find it so easy to see facelike patterns as actual faces.5
The same principles apply to our other senses. Play Led Zeppelin’s “Stairway to Heaven” backward and you may hear “Satan,” “666,” and some other strange words. Play Queen’s “Another One Bites the Dust” backward and the late Freddie Mercury might tell you “it’s fun to smoke marijuana.” This phenomenon can be exploited for fun and profit. A writer named Karen Stollznow noticed a faint outline on a Pop-Tart that could be interpreted as the miter-style hat traditionally worn by the pope. She snapped a digital photo, uploaded it to eBay, and opened up bidding on the “Pope Tart.” Over the course of the auction she exchanged numerous entertaining e-mails with believers and skeptics. By the end, the winning bid was $46. She attributed the relatively low price paid for the Pope Tart to a lack of publicity, as compared with the press releases and television coverage received by the Virgin Mary Grilled Cheese.6