The Self Illusion

by Bruce Hood

  This is wrong because the processes that weigh our choices are unconscious. It may feel like you have reached your decision in the open courtroom of your mind but, in fact, most of the important stuff has been going on behind closed doors. You may be able consciously to consider choices as potential scenarios and then try to imagine what the choice would mean, but the information that is being supplied has already been processed, evaluated and weighed up well before you have had time to consider what you will do. It’s like when you say, ‘I’ve just had a great idea!’ It seems instantaneous but no light bulb suddenly went off in your head. It may have felt like a sudden enlightenment, but the boys in the backroom had been developing the idea all along and simply presented you with their analysis. As in Libet’s experiment, no single point in time marks the difference between knowing and not knowing when you are about to act. Even if you deliberate over an idea, turning it over in your conscious mind, you are simply delaying the final decision that has, to all intents and purposes, already been made.

  None of this is new. We have known since the days of psychology’s early pioneers – von Helmholtz and more famously Freud – that there are unconscious processes controlling our thoughts and behaviours.4 What is new is the extent to which these processes are there to protect the self illusion – the narrative we create that we are the ones making the decisions. This stems from the need to maintain the appearance that we are in control, even when we are not. We are so concerned with maintaining the illusion of the sovereignty of self that we are prepared to argue that black is white just to prove that we are right.

  This is why we effortlessly and sometimes unknowingly reinterpret our behaviour to make it seem that we had deliberately made the choices all along. We are constantly telling stories to make sense of our selves. In one study, adults were shown pairs of female faces and asked to choose which of the two women was the more attractive.5 On some trials, immediately after they had made their choice, the card with the picture of the chosen woman was held up and the participants were asked to explain why they had chosen her over the other. Was it her hairstyle or the colour of her eyes? The cunning aspect of the study was that, on some of the trials, the experimenter used sleight of hand to switch the cards so that participants were asked to justify a choice they had not made – to support the choice of the woman they had actually just rejected. Not only did most switches go undetected, but participants went on to give perfectly lucid explanations for why this woman was so much more attractive than the one they had rejected. They were unaware that their choice was not their choice. It works for taste tests as well. When shoppers were asked to sample different jams and teas at a Swedish supermarket, the researchers again switched the products after the shoppers had selected their preferred flavours, and then asked them to describe why they had chosen one flavour over another. Whether it was a switch from spicy cinnamon apple to sour grape jam, or from sweet mango to pungent Pernod-flavoured tea, the shoppers detected fewer than a third of the switches.6 It would seem that, once we have stated a preference, we are committed to justifying our decision.

  This shows just how easy it is to fool our selves into thinking that our self is in control. As Steven Pinker7 put it, ‘The conscious mind – the self or soul – is a spin doctor, not the commander-in-chief.’ Having been presented with a decision, we then make sense of it as if it were our own. Otherwise, it would suggest that we don’t know what we are doing, which is not something that most of us would readily admit.

  Sour Grapes

  That we can so readily justify our choices is at the heart of one of the ancient world’s best-known fables about our need to spin a story. One day a hungry fox came across a bunch of grapes hanging high on a vine but, despite repeated leaping attempts, he was unable to dislodge them. Defeated, he left, saying that he did not want them anyway because they were sour. He had changed his mind. Whenever we talk disparagingly about something that we initially wanted but did not get, we are said to be displaying ‘sour grapes’. It’s very common. How often have we all done this when faced with the prospect of loss? Consider all those jobs you interviewed for but failed to get. Remember those dates that went disastrously wrong, or the competition you entered and lost? We console our selves with the excuse that we did not want the job anyway, that the other person was a jerk, or that we were not really trying to win. We may even focus on the negative aspects of being offered the job, getting a kiss or winning the competition. But we are conning our selves. Why do we do this?

  Who would have thought that a Greek slave born over 2,500 years ago would produce some of the most enduring commentaries on the human condition – storytelling that anticipated recent theories in cognitive science? Remarkably, Aesop’s fables about animals behaving like humans endure not only because they are immediately accessible metaphors for the vagaries of human behaviour, but also because they speak to fundamental truths. In the case of the fox and the sour grapes, Aesop is describing cognitive dissonance – one of the major psychological mechanisms discovered and researched over the last fifty years, one that has generated an estimated 3,000-plus studies.

  Cognitive dissonance, a term coined by Leon Festinger in 1957, is the process of self-justification whereby we defend our actions and thoughts when they turn out to be wrong or, as in the case of sour grapes, ineffectual.8 We interpret our failure to attain a goal as actually being a good thing because, with hindsight, we reinterpret the goal as not really desirable. Otherwise, we are faced with the prospect that we have wasted a lot of work and effort to no avail. This discrepancy is what creates the dissonance: on the one hand, we believe that we are generally pretty successful at attaining our goals; on the other, we were unsuccessful at attaining this particular one. The resulting mental discomfort is unpleasant, so to avoid the conflict we reinterpret our failure as a success. We tell our selves that the goal was actually not in our best interests. Job done – no worries.

  Freud similarly talked about defence mechanisms that we use to protect the self illusion. The self illusion sometimes has to reconcile incongruent thoughts and behaviours. For example, I may consider myself to be a good person but then have bad thoughts about someone. That is inconsistent with my good self-story, so I employ defence mechanisms. I may rationalize my thoughts by saying that the person is actually bad and that I am justified in my negative attitude towards them. Perversely, I may do the opposite and go out of my way to think of them positively, compensating for my unconscious negativity in what Freud called ‘reaction formation’. Or I may project my negative feelings on to the person’s pet dog, blaming the poor mutt for my dislike when it is actually his owner I despise. All of these are ways we reframe the unpleasant feelings we have towards someone in order to maintain our valued sense of self – a self that is not unduly or unfairly judgemental of others.

  It is worth pointing out that justification happens not only at the level of the self; it can also happen at the level of groups. Probably the best recent example is the justification for the Iraq War on the basis of the alleged threat from weapons of mass destruction (WMDs). The British general public was told that Saddam Hussein had missiles that could reach the mainland within the infamous forty-five minutes. The nation was shocked. Despite repeated assurances by United Nations inspection teams that there was no evidence for such WMDs, we were told that they were there and that we had to invade. After the invasion, once it was clear that there were no WMDs, the instigators had to justify their actions. We were told that the invasion was necessary on the grounds that Saddam Hussein was an evil dictator who needed to be removed from power, even though such regime change was in violation of international law. We were told that, even if he did not have WMDs, he had been planning to make them. The invasion was justified. We had been saved. It would appear that modern politicians do not need a thick skin so much as a carefully crafted capacity for mass cognitive dissonance.

  Cognitive dissonance protects the self from conflicting stories and is at the heart of why the self illusion is so important, but it also reveals the dangers that a strong sense of self can create. We use it to justify faulty reasoning. Although we do not appreciate it, our decision-making is actually a constellation of many processes vying for attention and in constant conflict. We fail to consider just how much of it is actually out of our control.

  The Monty Hall Problem

  There are essentially two problems with decision-making: either we ignore external information and focus on our own perspective or we fail to realize the extent to which our own perspective is actually determined by external influences. In both cases we are fools to believe that our self is making decisions independent of the external context. This can lead to some wondrous distortions of human reason.

  Consider an egocentric bias that blinds us to important changes in the world when it comes to decision-making. If you have not already heard of it, then let me introduce you to the Monty Hall problem. The problem is named after the presenter of the American game show, Let’s Make a Deal, in which the climax was to choose a prize from behind one of three doors. Try to imagine your self in this situation. You have made it all the way through to the final part of the show and have a chance of winning the jackpot. Behind two doors are booby prizes, but behind one door is a fabulous prize. For the sake of argument, let’s say that it is a £250,000 Ferrari. You hesitate initially and then choose door A. The host of the show, Monty, says, ‘You chose door A, but let me show you what’s behind door C.’ He then opens door C to reveal one of the booby prizes. Monty says, ‘You chose door A initially, but do you want to change your decision to door B?’ What should you do? Most people who encounter this problem for the very first time think that it makes no difference, because they reason that, with only two doors left to choose from, there is a 50-50 chance of winning the Ferrari. Indeed, people are reluctant to change their minds once they have made a choice. Some may say that we stubbornly stick with our decisions because we have the courage of our conviction. After all, it is important to be decisive.

  What do you think you should do – switch or stick? If you don’t already know, the correct answer is to switch, but if you don’t know why, it is incredibly hard to understand. The Monty Hall problem has become a famous cognitive illusion, appearing in bestselling books and even in the Hollywood movie 21 (2008), about a group of mathematically minded Massachusetts Institute of Technology students who counted cards at the blackjack tables of Las Vegas to beat the casinos. The correct solution is to switch, because you are more likely to win than if you stick with your first choice. This is difficult to see at first, and when the problem appeared in the popular magazine Parade in 1990, it created a storm of controversy and disagreement among both the general public and experts. Over 10,000 people (1,000 of them with PhDs) wrote in to complain that the advice to switch was wrong!

  The reason you should switch is that, when you first choose a door, you have a one in three chance of being correct, which means there is a two in three chance that the prize lies behind one of the two doors you did not choose. When Monty opens one of those doors to reveal a booby prize – and remember, Monty always shows you an empty door – he tells you nothing new about your own door, so its odds stay at one in three. The entire two in three chance that the prize was elsewhere now rests on the single remaining unopened door. You can check this by running through the cases: if you pick door A and always switch, you lose only when the car is actually behind A; you win when it is behind B (Monty must open C) and when it is behind C (Monty must open B) – two wins out of three equally likely outcomes. Simple – except that it is not simple for most people.

  An easier way to see the logic of the Monty Hall problem is to consider a variation in which there are 100 doors instead of three.9 Again you get to pick one door. Now Monty opens ninety-eight of the remaining ninety-nine doors to show you that they are all empty. There are now only two unopened doors. Would you switch now? Here we can see that our original door is unlikely to be the correct one: the chance of picking the right door first time was only one in a hundred, so there is a ninety-nine in a hundred chance that the prize sits behind the one door Monty has conspicuously left closed. That’s why we immediately twig that Monty is up to no good. There is something deeply counterintuitive about the Monty Hall problem, which reflects our limited capacity to think outside of the box – or, to be more precise, to think in an unselfish way.
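  If the odds still feel slippery, they are easy to check empirically. Below is a minimal simulation sketch in Python – my own illustration, not anything from the book, and the function names are made up – that plays the game repeatedly for both the three-door original and the hundred-door variant:

```python
import random

def play_monty_hall(n_doors: int, switch: bool) -> bool:
    """Play one round and return True if the player wins the prize."""
    prize = random.randrange(n_doors)       # door hiding the prize
    first_pick = random.randrange(n_doors)  # player's initial choice

    # Monty opens every door except the player's pick and one other,
    # never revealing the prize. The door he leaves closed is the prize
    # door if the first pick was wrong, otherwise a random empty door.
    if first_pick == prize:
        left_closed = random.choice([d for d in range(n_doors) if d != first_pick])
    else:
        left_closed = prize

    final_pick = left_closed if switch else first_pick
    return final_pick == prize

def win_rate(n_doors: int, switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning over many simulated games."""
    return sum(play_monty_hall(n_doors, switch) for _ in range(trials)) / trials

if __name__ == "__main__":
    for doors in (3, 100):
        print(f"{doors} doors: stick wins {win_rate(doors, False):.3f}, "
              f"switch wins {win_rate(doors, True):.3f}")
    # Typical result: sticking wins ~0.333 with 3 doors and ~0.010 with
    # 100 doors; switching wins ~0.667 and ~0.990 respectively.
```

Sticking wins exactly as often as the first guess is right – one time in three, or one time in a hundred – while switching wins whenever the first guess was wrong, just as the argument above predicts.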

  Another reason that people fail to switch in the Monty Hall problem is a general bias not to tempt fate. When it comes to making decisions, we inherently fear loss more than we value the prospect of a win. Despite the so-called rationality of the modern era, people still think that if they change their decision, there is more chance that they will regret doing so. It’s not so much stubbornness or superstition as it is the fear of loss outweighing the prospect of gain. For example, the social psychologist Ellen Langer sold $1 lottery tickets to fifty-three office workers. Each ticket stub was put into a box from which one lucky winner would receive the whole $53. Just before the draw a couple of days later, Langer approached each worker and asked how much they would sell their ticket for. If they had simply been handed a ticket by the experimenter, and so had exercised no choice, the average resale price was $2; but if they had chosen the ticket themselves, it was $8! Moreover, ten of the choosers and five of the non-choosers refused to sell their ticket at all.10 It turns out that it is the fear of regret that looms large in our minds. How many times have you deliberated over an expensive purchase only to hear the salesperson reassure you, ‘Go on, you’ll not regret it!’

  Risky Analysis

  What the Monty Hall problem illustrates so clearly are the limitations of human reasoning – especially when it comes to probability. Probability is all about external information. Reasoning in terms of probable outcomes is very difficult because most of us think in a very self-centred way. We make decisions from our own perspective and often fail to take into consideration the external information that is most relevant.

  In fact, most complex science is based on probabilities and not absolute known truths about the universe. After the age of Newton and the scientific revolution of the seventeenth century, it was assumed that the universe was one big clockwork mechanism that could be understood through measurement and prediction. It was thought that if we improved the accuracy of our measurements, then we would understand better how the universe worked. The opposite happened. The universe became more complex. As our measurements grew more precise, we discovered that the universe was much messier than we had previously imagined. There was more noise in the system and less certainty. This noise gave birth to the age of statistical modelling, in which mathematicians tried to discover the workings of the universe using procedures that accounted, as best as possible, for all the variation that was observed. This is why the language of science is mathematics and its truths are based on probabilities.11

  Unfortunately, statistical analysis is not natural for the average man in the street. Our bodies and brains, for that matter, may operate in statistically predictable ways, but few of us explicitly understand statistical principles. This is why general audiences get so frustrated when they hear scientists in the media refusing to give a straight ‘yes’ or ‘no’ answer to the questions that concern them. They want to know what to do about global warming, whether childhood vaccination is dangerous, or how to prevent pandemic viruses. When answering, scientists talk in terms of probabilities rather than absolute certainties because they look at the big picture, in which they know there is going to be some variation. That’s not what the general public wants to hear. They want to know whether vaccination will harm their children. They are less interested in the group because that is not the way individuals think.

  The other problem with probability is that humans have not evolved to consider likelihood based on large amounts of data. Rather, we operate with heuristics – fast and dirty rules of thumb that generally serve us well. The German psychologist Gerd Gigerenzer has argued that humans have not evolved to work out probabilities such as those operating in the Monty Hall problem.12 We focus on the task as it is relevant to our self, and how it applies on an individual basis rather than across populations of people. We tend to evaluate only our own choices, not what is best for the group. Faced with two doors, my chances seem even. It’s only when I am faced with two doors a hundred times, or a hundred different people take the Monty Hall challenge, that the patterns become obvious.

  We often do not know the true incidence of an event, but instead guess at the figure based on whatever evidence we can accumulate. This is where all sorts of distortions in reasoning start to appear. In weighing up the evidence, we easily overestimate or underestimate risks based on the external information. For example, people naively overestimate the likelihood of dying in an airplane crash because we tend to judge such events as more common than they truly are. These are called ‘dread risks’, and they attract extra salience precisely because they are so uncommon. That is not surprising considering the dramatic coverage such tragedies generate. We focus on them and imagine what it must be like to die in such a helpless way. We attach more weight to these thoughts than we should because they are novel and draw our attention.

  This inaccurate risk assessment can be potentially dangerous, as we may be tempted to change our behaviour based on faulty reasoning. For example, an analysis of road traffic accidents for cars travelling to New York in the three months following 9/11 showed an increase in fatalities over the numbers expected for that time of year, in the build-up to Christmas.13 In fact, the inflated number was greater than the total number of airline passengers killed on that fateful day. Individuals frightened of flying to New York overestimated their risk and took to their cars instead, which led to heavier than usual traffic and the subsequent increase in road accidents. The most likely reason that people felt it was safer to drive lies in another illusion of the self: the illusion of control. We believe that we are safer when we think we are in control of our fate, such as when driving our own car, but feel uneasy when we are being driven by others or, worse still, flown around in a metal cylinder that can fall out of the sky, irrespective of what we do.

 
