
The Unpersuadables: Adventures with the Enemies of Science


by Will Storr


  It is a process that has been widely studied since the 1950s, when its existence was first hypothesised by Leon Festinger of Stanford University. One of the most egregious methods the brain uses to avoid the constipation of dissonance is a thought-flaw known as confirmation bias. The pattern, which you may also recognise, goes like this: when confronted by a new fact, we first feel an instantaneous, emotional hunch. It is a raw instinct for whether the fact is right or wrong and it pulls us helplessly in the direction of an opinion. Then we look for evidence that supports our hunch. The moment we find some, we think ‘Aha!’ and happily conclude that we are, indeed, correct. The thinking then ceases.

  Psychologists know this as the ‘makes sense stopping rule’. We ignore anything that runs counter to our hunch, grab for the first thing that matches, think, Yep, that makes sense, and then we rest, satisfied by the peerless powers of our fantastic wisdom. Perhaps the most embarrassing aspect of confirmation bias is the fact that we mistake this search for favourable evidence for a fair survey of both sides of the argument.

  Over the last few decades, a huge number of entertaining studies have revealed just how devious this delusion is. One of the neatest looked at unconscious sexism, and at how the brain justifies its secret, unpleasant prejudices to its owner. Participants were asked to consider candidates for the role of police chief. They had a choice: would a ‘streetwise’ man or a ‘formally educated’ woman be better suited to the job? The majority chose the man. When asked why, they said that they had thought carefully about this, and decided that it would be most useful for a police chief to be streetwise.

  For a second group, researchers switched the genders. This time, the male candidate was ‘formally educated’ and the female was ‘streetwise’. The majority chose the man. When asked why, they said that they had thought carefully about this, and decided that it would be most useful for a police chief to be formally educated. It is a discomforting thing, reading of these ordinary men and women, who presumably consider themselves to be kind and rational and fair, operating in such an unknowingly prejudiced manner. The study suggests that they had no idea why they believed what they believed, why they said what they said.

  We deal with dissonant information using a variety of yet more devious cognitive stunts. Psychologist Deanna Kuhn found that participants in a mock murder jury quickly compose their own story of ‘what I think happened’ and then proceed, as they survey the evidence, to pay attention only to the facts that fit their narrative. In an earlier study, Kuhn found that the brain has a tendency simply to forget things that it considers contradictory to its models. But perhaps the most breathtaking trick of all is how exposure to the opposing side of any argument often makes us even more biased towards our own beliefs.

  Studies have shown that we tend to subject the evidence of our foes to much closer scrutiny than we use on our own. One had people reading two arguments about the death penalty – a first report that conflicted with their opinions and a second that agreed. Most of the participants concluded that the essay that agreed with them was a ‘highly competent piece of work’. As for the document they disagreed with, they examined it with the eye of a prosecution lawyer until they found genuine flaws and magnified them, using even minor issues as the basis for disregarding the entire thing. As Thomas Gilovich writes in How We Know What Isn’t So, ‘Exposure to a mixed body of evidence made both sides even more convinced of the fundamental soundness of their original beliefs.’ Confirmation bias is profoundly human and it is appalling. When new information leads to an increase in ignorance, it is the opposite of learning, the death of wisdom.

  Recent studies have revealed even more unpleasant truths. In 2004, clinical psychologist Drew Westen and his colleagues used the bitter US election milieu to undertake an examination of the seductive power of the lies that we all tell ourselves. They took fifteen intelligent, educated Democrat voters and fifteen equally able Republicans and slid each into a brain scanner while presenting them with six pieces of ‘information’ (some made up by the psychologists) about John Kerry and George Bush. Each slide of information showed clear inconsistencies between the politician’s words and deeds. They saw one, for example, that quoted John Kerry telling an anti-war constituent, ‘I share your concerns. I voted in favour of a resolution that would have insisted that economic sanctions be given more time to work.’ The next slide had Kerry writing to another voter, a week later: ‘Thank you for expressing your support for the Iraqi invasion. From the outset, I have strongly and unequivocally supported President Bush’s response to the crisis.’ Each of the statements was written in such a way that any dispassionate observer would rate both politicians as duplicitous. Westen wanted to find out exactly how the brains of these ordinary voters dealt with this dissonant information.

  After they had read the slides, the participants were asked to rate each politician’s level of inconsistency on a scale of one to four. As Westen writes in The Political Brain, ‘They didn’t disappoint us. They had no trouble seeing the contradictions for the opposition candidate, rating his inconsistencies close to a four. For their own candidate, however, ratings averaged closer to two, indicating minimal contradiction.’

  But that was just the beginning. Westen also had his scans to consult. He wanted to know exactly what happened on the neurological level when new data arrived that conflicted with internal models; when their minds were blasted into a state of cognitive dissonance. As he expected, the unpleasant emotion was soothed away quickly. ‘But the political brain also did something we didn’t predict,’ he writes. ‘Once participants had found a way to reason to false conclusions, not only did neural circuits involved in negative emotions turn off, but circuits involved in positive emotions turned on. The partisan brain didn’t seem just satisfied in feeling better. It worked overtime to feel good, activating reward circuits that give partisans a jolt of positive reinforcement for their biased reasoning.’

  We fall for the lies of our own brain. When we do, it rewards us. It seals its little mischief with a neurochemical kiss, drugging us into feeling good about what we have done. Of course, if we did carefully consider and fairly assess every new argument that we encountered, we would become confused, socially isolated and, quite possibly, insane. And mostly concluding that we are right about everything does have other benefits. As Tavris and Aronson so eloquently put it, ‘Dissonance reduction operates like a thermostat, keeping our self-esteem bubbling along on high.’ Indeed, a great many of the findings of decades of experimental psychology point to one grand and shameful conclusion: we are all deluded egotists.

  Humans are subject to a menagerie of biases, a troubling proportion of which hiss seductive half-truths in the ear of our consciousness. They tell us that we are better looking, wiser, more capable, more moral and have a more glittering future in store than is true. One of my favourite studies involves participants trying to find a photograph of themselves that has been hidden in a panorama of hundreds of portraits of others. People tended to find their own image more quickly when it had been digitally enhanced to make them appear more attractive, suggesting that none of us are as good looking as we think we are. Discussing a related experiment, behavioural psychologist Nicholas Epley told the New Scientist, ‘When we ask people to rate how attractively they will be rated by somebody else and correlate it with actual ratings of attractiveness we find no correlation. Zero! This still shocks me.’

  Another experiment had participants reading an essay about Rasputin – one set of readers were presented with the correct text while another group had a version in which the monk’s date and month of birth had been altered to match their own. When questioned, the group with the similar birthdays generally thought more highly of the mad monk than the others, without having any clue as to why. Studies have also shown that we consistently overrate the quality and value of our own work.

  Our ego acts upon the truth as a planet acts upon gravity. Reality warps as it pulls towards it. A cognitive error we all share, known as the spotlight effect, means that we go through our social lives convinced that everything we are saying, doing and feeling is being closely examined by those around us even though, in reality, they are all preoccupied with themselves, equally convinced the spotlight is on them. Gamblers rewrite their memories, crediting payouts to their excellent judgement and recalling their losses as near-wins or as simple bad luck. Athletes tend to put their victories down to training, strength and stamina and their losses to unfortunate circumstance; 74 per cent of drivers consider themselves better than average; 94 per cent of university professors think they are better than average. When husbands and wives are asked to guess what percentage of the housework they do, their totals average 120 per cent. Half of all students in one survey predicted that they would protest upon hearing an overtly sexist comment. When secretly tested, just 16 per cent actually did.

  When we behave badly, it is usually because we were put in an unhappy situation. Circumstance has conspired against us. Really, I had no choice. When others do wrong, it is because of their character flaws. Professor Roy Baumeister, who specialises in the study of evil, has found that even domestic abusers and murderers tend to view themselves as having acted reasonably in the face of unfair provocation. The Nazis believed that they were on a mission of good. He writes, ‘The perpetrators of evil are often ordinary, well-meaning human beings with their own motivations, reasons and rationalisations for what they are doing … many especially evil acts are performed by people who believe they are doing something supremely good.’

  We love to judge others. We love to categorise. We love to divide. We are the good guys, they are the bad guys. We the hero, they the demon. Why? Because it fits the model. It bolsters the ego. It makes us happy. It has even been demonstrated that depressed people, with their dysfunctionally gloomy predictions about themselves and the world, are more accurate in their outlook than the mentally ‘healthy’. The world, and your life within it, is far bleaker than you have been led to believe.

  A final example should, I hope, offer some idea of the brilliant power of the lies we tell ourselves. We typically have a bias that tells us we are less susceptible to bias than everyone else. Our default position tends to be that our opinions are the result of learning, experience and personal reflection. The things we believe are obviously true – and everyone would agree if only they could look at the issue with clear, objective, unimpeded sight. But they don’t because they’re biased. Their judgements are confused by ill-informed hunches and personal grudges. They might think they’re beautiful and clever and right but their view of reality is skewed.

  You might have read all of that thinking, Yes, yes, I know people just like that. But I’m not really one of them, to be honest. I’m modest and humble and only too aware when I’m getting things wrong. That’s the sound of your brain lying to you. You are like that. If you are now thinking, Yes, yes, yes, I hear what you’re saying – but if you knew me you would realise that I’m not one of those people, I’m sorry to say that you’re still at it. Most of us think we are the exception. This most disturbing of truths has been widely demonstrated in study after study. When individuals are educated about these ego-defending biases and then have their biases re-examined, they usually fail to change their opinions of themselves. Even though they accept, rationally, that they are not immune, they still think as if they are. It is a cognitive trap that we just can’t seem to climb out of.

  Our prejudices and misbeliefs are invisible to us. They form in childhood and early adolescence, when our brain is in its heightened state of learning, when it is building its models, and then they disappear from view. We can think as long and as hard as we like about our biases – we can root about in our own heads for hours, utterly convinced of our own objectivity, and still come up with nothing. They are inaccessible to the conscious part of our minds. The trick is so embedded – our warped sensations of right, wrong and truth are so folded into our fundamental sense of self – that we cannot detect them.

  Just as the knifefish assumes his realm of electricity is the only possible reality, just as the hominin believes his tricolour palette allows him to see all the colours, just as John Mackay is convinced that lesbian nuns are going to hell, we look out into the world mostly to reaffirm our prior beliefs about it. We imagine that the invisible forces that silently guide our beliefs and behaviour, coaxing us like flocks of deviant angels, do not exist. We are comforted by the feeling that we have ultimate control over our thoughts, our actions, our lives.

  There are seven billion individual worlds living on the surface of this one. We are – all of us – lost inside our own personal realities, our own brain-generated models of how things really are. And if, after reading all of that, you still believe you are the exception, that you really are wise and objective and above the powers of bias, then you might as well not fight it. You are, after all, only human.

  7

  ‘Quack’

  Over and over again, they told her that she was being silly. But Gemma was convinced that her doctors were mistaken. You just know, don’t you, when something is wrong, when the sensed systems inside your body nudge from their alignments. Strange shapes and colours would waft and form in her vision. She would fall asleep on the sofa and nobody could wake her. She was having difficulties in the office – her managers kept insisting they had told her what to do, but she had no memory whatsoever of them doing so. They had begun to treat her as if she was stupid. Gemma was not stupid. She had qualifications. A degree. But she didn’t feel very clever when she sat in that chair in her doctor’s surgery, desperate for him to listen. Every time she went, he would say the same thing. There is nothing whatsoever to worry about. You’re just a young girl, being silly.

  They found six small tumours, the size of thumbnails, on her brain. Oh, it’s nothing too serious, they said. They’re benign. Some people live with these kinds of things for the whole of their lives without a problem. But Gemma knew her own body. She knew her own mind. She knew that she was not the sort of person to sleep through her radio alarm clock, to courier files to the wrong office, to forget where she had parked her car. You’re panicking, being silly.

  The tumours grew. They conducted a biopsy. They drilled into her head. It took eight weeks for the results to come. You can’t imagine the terror. Two months of it. Not knowing, wondering if you might die. When the results finally arrived, they said they were benign. Harmless. Fine. Silly.

  They found new tumours – these ones on her spine. She was alone when they called her. She telephoned her parents, but nobody was home. Her boyfriend wasn’t picking up either. None of her friends were in. Evening had fallen before she was able to speak to anyone. All of those hours, alone with the news.

  Chemotherapy made her sick. Over the course of a single weekend, all of her hair fell out. The tumours grew in size and threat. They gave her steroids. She gained four stones in one month. She had a distended stomach, a great big puffed-out moon face. She had to lift her eyelids with her finger if she wanted to see anything. Her bowels didn’t move for weeks. She had a wheelchair, a stick. Her sight became so bad that she couldn’t watch television or read. She had nothing to do but lie there, terrified in her nauseous gloaming. She thought, I’m only twenty-six. I’m the youngest of seven children. The youngest! It’s not my turn. Early in the winter of 1995, the oncologist visited her hospital bed. He said something strange. ‘Okay, Gemma, these are your options. You can stay here, you can go to a hospice or you can go home.’

  Gemma was groggy, confused. She reasoned, ‘Well, sick people go to hospital. Dying people go to a hospice. And home – that’s for fit people.’

  She was delighted.

  ‘Home, please!’

  ‘Fine,’ said the doctor, kindly. ‘You’ve got those little pills and you’ve got him up there. Make sure you have a happy Christmas.’

  What an odd thing to say. Have a happy Christmas? It was only October. It was some time before Gemma realised that this was her doctor’s way of telling her that they had been wrong all along. That her tumours were, in fact, malignant. That she had cancer and would be dead within four months.

  She felt betrayed. She had done everything they had asked of her. Medicine was like a, b, c, wasn’t it? You got sick, they treated you and then you got better. It wasn’t supposed to be like this.

  Despite her bleak prognosis and her new medication, which now only treated her symptoms, Gemma carried on taking the ‘little pills’ that her oncologist had mentioned with his gently knowing smile. To her amazement, they seemed to work. By Christmas, her eyelids had opened up. Her bowels began to move. Her sight returned. And the more of the little pills that she took, the better she became. A year later, Gemma called her oncologist’s office and asked why they hadn’t been in touch. She was angry. She knew why – it was because they had assumed that she was dead. And who were they anyway? They’re not God. They don’t decide when I’m going to die. When her oncologist next examined her, he wrote in his notes, ‘Gemma has made a remarkable recovery. Her case will remain a mystery.’

  ‘But it’s not a mystery to you, is it?’ I say to Gemma, who has been telling me her story in the front room of her modest Sutton Coldfield home.

  ‘Not to me,’ she smiles.

  The ‘little pills’ Gemma Hoefkens had been taking were homeopathic. She believes that they not only saved her life, they also changed its direction forever. She is now a licensed homeopath who claims not to have seen a doctor for fourteen years.

  The industry that Gemma works in is worth four million pounds a year in the UK and billions in Europe and the US. Over fifteen thousand NHS prescriptions are issued for homeopathy annually, it sells in high-street chemists and user-satisfaction ratings in Britain score above 70 per cent. And yet Gemma’s oncologist was not alone in his reservations over its efficacy. Throughout its defiant 230-year history, homeopathy has attracted the disbelieving fury of doubters from Richard Dawkins today all the way back to Charles Darwin, who wrote, ‘In homeopathy common sense and common observation come into play and both these must go to the dogs.’ Over the last few years, a campaign to stop the homeopaths has gathered into a truly damaging force. Questions have been asked in Parliament. In February 2010, the House of Commons Science and Technology Committee recommended that the NHS cease funding the discipline. Even ex-Prime Minister Tony Blair has become involved, saying, ‘My advice to the scientific community would be, I wouldn’t bother fighting a great battle over homeopathy.’ But they do and, at least in Britain, they are winning: between 2000 and 2011 there was an eightfold drop in NHS homeopathy prescriptions. It now comprises just 0.001 per cent of the NHS’s annual drug budget. And Gemma has suffered personally at the hands of reason’s furious armies.

 
