* * *
This Is Your Brain on Politics
During the 2004 presidential campaign, the Emory University psychologist Drew Westen and his colleagues conducted brain scans of fifteen Bush supporters and fifteen Kerry supporters, who were asked to evaluate statements attributed to each candidate. The researchers told the subjects that Kerry had reversed his position on overhauling Social Security and that Bush had flip-flopped on his support for the former chief executive of Enron, Ken Lay.
Not surprisingly, each group judged the other’s candidate harshly but let its own candidate off fairly easy—clear evidence of bias. More interesting was what the brain scans showed. “We did not see any increased activation of the parts of the brain normally engaged during reasoning,” Westen said in announcing his results. “What we saw instead was a network of emotion circuits lighting up.”
Fig. 1: Emotional centers active when processing information unfavorable to the partisan’s preferred candidate
Furthermore, after the partisans had come to conclusions favorable to their candidates, their brain scans showed activity in circuits associated with reward, rather as the brains of addicts do when they get a fix. “Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it,” Westen said.
Fig. 2: Reward centers active when processing information that gets the partisan’s preferred candidate off the hook
Westen’s experiment supplies physical evidence that emotionally biased thinking may be hard-wired into our brains.
Images from Drew Westen, Journal of Cognitive Neuroscience (2006), MIT Press Journals, © by the Massachusetts Institute of Technology.
* * *
Psychologists call this phenomenon confirmation bias, and it colors not only how we see things but also how we reason. David Perkins, a professor of education at Harvard, has aptly called it “myside bias.” In studies of how people reason when asked to think about a controversial issue, Perkins observed a strong tendency for people to come up with reasons favoring their own side and not even to think about reasons favoring the other. His test subjects offered three times as many considerations on their own side of an issue as against their position, and that count included arguments they brought up just for the sake of shooting them down.
The University of Pennsylvania psychologist Jonathan Baron found a classic example of myside bias in a Daily Pennsylvanian student article in favor of abortion rights, which said: “If government rules against abortion, it will be acting contrary to one of the basic rights of Americans, … the right to make decisions for oneself.” The author of that sentence was oblivious to the thought that the other side sees abortion as equivalent to murder, and that laws against homicide also interfere with “the right to make decisions for oneself” when the decision is to commit murder.
When Baron asked fifty-four University of Pennsylvania students to prepare for a discussion of the morality of abortion, he found, as expected, that they tended to list arguments on only one side of the question. Even more revealing, the students who made one-sided arguments also rated other people’s arguments as being of better quality when those arguments were all on one side, even arguments for the opposing position. He concluded: “People consider one-sided thinking to be better than two-sided thinking, even for forming one’s own opinion on an issue.”
General Norman Schwarzkopf fell into the confirmation bias trap after leading U.S. forces to one of the most lopsided military victories in history during the Gulf War in 1991. As Colin Powell tells it in his autobiography, My American Journey, Schwarzkopf appeared at a news conference with video that he said showed a U.S. smart bomb hitting Iraqi Scud missile launchers. When Powell informed him that an analyst had identified the targets as fuel trucks, not missile launchers, Schwarzkopf exploded. “By God, those certainly were Scuds. That analyst doesn’t know what he’s talking about. He’s just not as good as the others.” Powell says later examination showed that the analyst was right. Schwarzkopf just couldn’t see it. Believing that his forces were really hitting Scud launchers, he was open only to evidence that confirmed his belief.
Even scholars are affected by this powerful bias. In the 1980s, the National Institute of Education (NIE) asked six scholars to conduct an analysis of existing research into the effects of desegregated schools. Two of the scholars were thought to favor school integration, two to oppose it, and two to be neither opponents nor proponents. Sure enough, the differences in their findings were consistent with their ideological predispositions. The differences were slight, which is a testament to the power of the scientific method to rein in bias. But the bias was there nonetheless.
Once you know about confirmation bias, it is easy to detect in others. Confirmation bias was at work when CIA analysts rejected evidence that Iraq had really destroyed its chemical and biological weapons and gave weight only to signs that Saddam Hussein retained hidden stockpiles. Confirmation bias explains why so many people believe in psychics and astrologers: they register only the apparently accurate predictions and ignore those that miss. Confirmation bias explains why, once someone has made a bad first impression on a date or during a job interview, that impression is so hard to live down. And it is because of confirmation bias that good scientists try actively to disprove their own theories: otherwise, it would be just too easy to see only the supporting evidence.
To avoid this psychological trap, apply a bit of the scientific method to political claims and marketing messages. When they sound good, ask yourself what fact could prove them untrue and what evidence you may be failing to consider. You may find that a partisan or dogmatic streak is keeping you from seeing facts clearly.
The “I Know I’m Right” Trap
There’s evidence that the more misinformed we are, the more strongly we insist that we’re correct. In a fascinating piece of research published in 2000, the political psychologist James H. Kuklinski and his colleagues reported findings from a random telephone survey of 1,160 Illinois residents. They found few who were very well informed about the facts of the welfare system: only 3 percent got more than half the questions right. That wasn’t very surprising, but what should be a warning to us all is this: those holding the least accurate beliefs were the ones expressing the highest confidence in those beliefs.
Of those who said correctly that only 7 percent of American families were getting welfare, just under half said they were very or fairly highly confident of their answer. But 74 percent of those who grossly overestimated the percentage of families on welfare expressed that same level of confidence, even though the figure they gave (25 percent) was more than three times too high. This “I know I’m right” syndrome means that those who most need to revise the pictures in their heads are the very ones least likely to change their thinking. Of such people, it is sometimes said that they are “often in error but never in doubt.”
The “Close Call” Trap
Psychological research shows that when we are confronted with tough decisions and close calls, we tend to exaggerate the differences. The psychologist Jack Brehm demonstrated this in a famous experiment published in 1956. He had women rate eight different products such as toasters and coffeemakers, then let them keep one—but allowed them to choose between only two of the products. He set up some of the decisions as “close calls,” between two products the women had rated alike; others were easy calls, with wide differences in ratings. After the women had made their choices, Brehm asked them to rate the products again. This time, women who had been forced to make a tough choice tended to be more positive about the product they had picked and less positive about the one they had rejected. This change was less evident among women who had made the easy call.
Psychologists call this the “spreading of alternatives” effect, a natural human tendency to make ourselves feel better about the choices we have made, even at the expense of accuracy or consistency. We crave certainty, and don’t want to agonize endlessly about whether we made the right call. This mental habit helps us avoid becoming frozen by indecision, but it also can make changing our minds harder than need be when the facts change, or when we have misread the evidence in the first place. Once in a while we need to ask, “Would I feel this way if I were buying this product (or hearing this argument) for the first time? Have new facts emerged since I made my initial decision?”
It’s easy to fall into traps like the ones we’ve described here, because people manage most of the time on automatic pilot, using mental shortcuts without really having to think everything through constantly. Consider a famous experiment published by the Harvard psychologist Ellen Langer in 1978. She and her colleagues repeatedly attempted to cut in front of people about to use a university copying machine. To some they said, “Excuse me. May I use the Xerox machine, because I’m in a rush?” They were allowed to cut in 94 percent of the time. To others, the cheeky researchers said only, “Excuse me. May I use the Xerox machine?” without giving any reason. Those requests succeeded only 60 percent of the time. So far, that’s what you would probably expect: we’re likelier to accommodate someone who has a good reason for a request than someone who just wants to push ahead for personal convenience. But here’s the illuminating point: Langer showed that giving an obviously bogus reason worked just as well as giving a good one. When Langer’s confederates said, “Excuse me, may I use the Xerox machine, because I have to make some copies?” they were allowed to cut in 93 percent of the time.
“Because I have to make some copies” is really no reason at all, of course. Langer’s conclusion is that her unwitting test subjects reacted to the word “because” without really listening to or thinking about the reason being offered; they were in a state she called “mindlessness.”
Others have demonstrated the same zombielike tendency, even among university students who supposedly are smarter than average. Robert Levine, a psychology professor at California State University, Fresno, tried different pitches during a campus bake sale. Asking “Would you like to buy a cookie?” resulted in purchases by only two out of thirty passersby. But his researchers sold six times more cookies when they asked “Would you like to buy a cookie? It’s for a good cause.” Of the thirty passersby who were asked that question, twelve bought cookies. And none even bothered to ask what the “good cause” was.
Marketers use the insights from such studies against us. An Internet-based salesman named Alexi Neocleous tells potential clients that Langer’s study shows “because” is “a magic word [that] literally forces people to buckle at the knees and succumb to your offer.” He adds, “The lesson for you is, give your prospects the reason why, no matter how stupid it may seem to YOU!”
The lesson we should draw as consumers and citizens is just the opposite: watch out for irrelevant or nonexistent reasons, and make important decisions attentively. “Mindlessness” and reliance on mental shortcuts are often fine; we probably won’t go far wrong buying the most popular brand of soap or toothpaste even if “best-selling” doesn’t really mean “best.” Often the most popular brand is as good a choice as any other. But when we’re deciding on big-ticket items, it pays to switch on our brains and think a bit harder.
How can we break the spell? Research shows that when people are forced to “counterargue”—to express the other side’s point of view as well as their own—they are more likely to accept new evidence rather than reject it. Try what Jonathan Baron, of the University of Pennsylvania, calls active open-mindedness. Baron recommends putting initial impressions to the test by seeking evidence against them as well as evidence in their favor. “When we find alternatives or counterevidence we must weigh it fairly,” he says in his book Judgment Misguided. “Of course, there may sometimes be no ‘other side,’ or it may be so evil or foolish that we can dismiss it quickly. But if we are not open to it, we will never know.”
That makes sense to us. We need to ask ourselves, “Are there facts I don’t know about because I haven’t looked for them? What am I missing here?” Otherwise, we’re liable to end up like Mrs. Keech’s UFO cultists, preaching with utter conviction that Guardians from the Planet Clarion really do exist, or like the blustering General Schwarzkopf angrily denying the truth about those burned-out tanker trucks. It’s better to be aware of our own psychology, to know that our brains tend to “light up” to reinforce our existing beliefs when we hear our favorite candidates or positions challenged. To avoid being deceived (or deceiving ourselves) we have to make sure the pictures in our heads come as close to reflecting the world outside as they reasonably can.
Chapter 5
Facts Can Save Your Life
Getting the facts right is important. It can save your money, your health, even your freedom.
We’re not exaggerating one bit. Consider the story of Daniel Bullock, a California physician who got spun by a sleazy tax-shelter promoter and then received some unwelcome visitors carrying badges and guns. “My seventeen-year-old daughter answered the door to some armed federal agents from the [IRS] criminal investigation division,” he recalled. “That was a bad day.” And worse followed: Bullock lost his medical license and served eight months in a federal prison camp, all because he had failed to check the facts when a smooth-talking promoter sold him what turned out to be a criminal tax-evasion scam.
Bullock was a churchgoing orthopedic surgeon from Mount Shasta, California, who did volunteer work in Central America. But like a lot of people, he hated paying his taxes, and resented the stories he had heard about how others avoided taxes entirely. “When I encountered someone with ‘inside information’ on how the very wealthy avoid taxes I was all ears,” Bullock told a Senate subcommittee in 2002, after he had started serving his sentence. “He had a good story, a well used and ‘successful’ strategy, hundreds of clients and legal opinions in support of his program.” Bullock fell into the “I don’t want to hear it” trap. He was so convinced that his benefactor had discovered a legal way to avoid paying taxes that he failed to look for evidence to the contrary.
Bullock bought into a preposterous scheme that sent his earnings on a round trip to the Caribbean through a series of offshore banks and “nonprofit” trusts. These entities returned the money to him by paying his mortgage and other personal living expenses. That made Bullock’s money hard for the IRS to trace but did nothing to erase his legal obligation to pay taxes on it. It was just a scam, as was clear to Bullock’s bookkeeper, who eventually turned him in to the IRS. The judge who sentenced Bullock thought it should have been apparent to anybody. “Any ten-year-old would know this was obviously illegal,” he said.
Bullock might have avoided prison by practicing a little “active open-mindedness” and asking himself how likely it was that the IRS would actually allow this kind of dodge. He might also have called someone who didn’t stand to make money on the scheme, unlike the promoter who collected thousands of dollars in fees for his “services.” Bullock’s own lawyer or accountant might have quickly set him straight about what a dim view the IRS takes of sham transactions and international money-laundering. Bullock might also have conducted a quick Internet search on the name of the promoter who sold him the scheme, which might have turned up the fact that the man had already been convicted on seven counts of aiding and assisting the filing of false tax returns. Today, a simple search for the term “tax schemes” brings up page after page of official warnings about similar scams and the prosecutions that often result. Try it yourself.
Bullock’s story shows that letting bad information go unchallenged can have grave consequences. He is not alone. Thousands of people have used tax-evasion schemes like the one he fell for, and while most don’t go to prison, they all risk being forced to pay fines, big penalties, and interest on top of the back taxes when they are caught. Some may know they are engaging in criminal activity, but we suspect the majority are like Bullock, actively obtuse when it comes to matters that improve their tax returns. And it hardly matters whether we have been deceived by somebody else or have failed to check out our own dubious assumptions. The price for getting the facts wrong is the same whether we are deceived or self-deceived, misinformed or disinformed, spun by others or spun by the pictures in our heads. The message here is simple: facts matter.
The “Grey Goose Effect”
Most bad information won’t land you in jail or ruin your career, but much of it will cost you money. For proof of that, look no further than the snake-oil hustles we mentioned in Chapter 1. We can be manipulated into spending too much money in more subtle ways, too. For example, we tend to think of higher-priced goods as being of better quality than lower-priced goods; but while “You get what you pay for” may be common folk wisdom, it isn’t always true. In the 1950s, Pepsi competed with Coca-Cola by selling its soda at half the price of Coke and advertising “twice as much for a nickel.” But more people bought Pepsi after it raised its price, a lesson not lost on other marketers. A formerly obscure brand of Scotch whisky also increased its sales by raising its price, giving its name to what is now known as the Chivas Regal effect.