A Mind of Its Own
There is something very eerie about the power of other people’s beliefs to control you without your knowledge. And there is little you can do to protect yourself against an enemy whose potency resides in its very imperceptibility. But even creepier, surely, is the prospect that your own pessimistic convictions could be insidiously working against you. A woman’s expectations for how her relationship will turn out, for example, may ‘create her own reality’.13 If a woman were excessively concerned about a romantic partner’s true commitment to the relationship, and overly preoccupied with the possibility of rejection by him, could her hypersensitive reactions to conflict bring about the very outcome she feared? In a test of this hypothesis, psychologists invited couples to place the dynamics of their relationship under microscopic scrutiny. Both members of each couple separately rated their feelings about their partner and their relationship, their satisfaction with it and their commitment. They also filled out a questionnaire that probed for anxieties about rejection from ‘significant others’ before they were brought together again and seated in a room with a video camera pointed at them. Next, to create a little interesting conflict, they were asked to discuss an issue in their relationship that tended to chill atmospheres and fray tempers. Then – just to see what effect this rattling of each other’s cages had had – they once again separately rated their emotions about their loved one. Once they had both safely departed from the laboratory, other psychologists (who did not know what the experiment was about) did what we all wish we could do as we rake through the ashes of a scorching argument. They reran the tape to comb for unambiguous evidence of scratchy comments, nasty put-downs, hostile gestures, or unpleasant tones of voice.
Before the videotaped discussions, the partners of rejection-sensitive women were just as positive about their relationship as the partners of women with a more robust attitude towards relationships. But afterwards, the partners of the touchier women were quietly fuming. The researchers discovered the reason for this in the videotapes. The women who feared rejection behaved more cantankerously during the airing of conflict-ridden issues and, according to the researchers’ statistical analyses, it was this that was so exasperating their partners. Enough to dissolve the relationship? It seemed so. A second experiment showed that the relationships of rejection-sensitive women – despite being just as healthy and happy to begin with – were nearly three times more likely to end than those of women who took conflict in their stride. Expecting rejection, these more vulnerable women behaved in ways that turned their fears into reality.
So far, our reluctance to survey the world with an open mind seems to have little to recommend it. Are there any potential benefits to be had from our obduracy? Psychologists have pointed out that a modicum of obstinacy in relinquishing our beliefs is only sensible. After all, we would end up in rather a flap if our beliefs were forever fluctuating in response to every newspaper report or argument with an in-law. There’s also a sense in which our important beliefs are an integral part of who we are. To bid a belief adieu is to lose a cherished portion of our identity.14 Interestingly, people who have recently indulged in extensive contemplation of their best qualities (or been ‘self-affirmed’, to use the cloying terminology of the literature) are more receptive to arguments that challenge their strongly held beliefs about issues like capital punishment and abortion. By hyping up an important area of self-worth, you are better able to loosen your grip on some of your defining values. (Just loosen your grip, mind. Not actually let go.) It’s a curious, and somewhat disquieting, fact that effusive flattery dulls the sword of an intellectual opponent far more effectively than mere logical argument.
It would be much more pleasant to leave it at that: we’re pigheaded, yes, but it’s for good reasons. However, research shows that our stubbornness is so pernicious that even the most groundless and fledgling belief enjoys secure residence in our brains. As a consequence, we are at the mercy of our initial opinions and impressions. In a classic demonstration of this, some volunteers were given a test of their social sensitivity.15 They read a series of pairs of suicide notes and for each pair they had to guess which note was genuine and which was a fake. Some volunteers were then arbitrarily told that their social-sensitivity performance was superior, others that it was inferior. A little later the experimenter debriefed the volunteers, explaining that the feedback they’d been given about their social sensitivity was made up, and that their supposed score had been randomly decided before they even walked into the lab. Any ideas the volunteers had developed about their proficiency in discriminating between genuine and fake suicide notes should have been abolished by the debriefing. After all, the evidence on which those beliefs were based had been entirely discredited. But still, the volunteers continued to believe in their superior or inferior social sensitivity. When the experimenter asked the volunteers to guess how well they would actually do on this and other similar tasks, their answers reflected whether they had been given ‘superior performance’ or ‘inferior performance’ false feedback on the suicide notes task. What is particularly remarkable about this experiment is that even people who were told that they were social clodhoppers carried on believing it. Even though their vain brains had been handed a bona fide rationale on which to restore their self-esteem, they continued to believe the worst about themselves.
In a similar experiment, researchers gave high school students training in how to solve a difficult mathematical problem.16 Half of the students watched a clear and helpful video presentation. The other half watched a deliberately confusing video presentation that left them floundering. Unsurprisingly, these latter students wound up feeling pretty crestfallen over their ham-handedness with numbers. This lack of confidence persisted even after the researchers showed them the clear video presentation and explained that their poor maths performance was due to the bad instruction, not to their actual ability. Even three weeks later, the students unfortunate enough to have watched the baffling video presentation were less likely to show interest in signing up for other similar maths classes. And so, possibly, the entire course of their future lives was changed.
Indeed, at this point you may be beginning to feel uneasy stirrings about the ethics of psychology researchers giving false feedback – particularly negative feedback – to unsuspecting volunteers. The first chapter of this book, ‘The Vain Brain’, bulged with experiments in which volunteers were told something unpleasant about their personalities, skills, future prospects or health. To be sure, the experimenters always debriefed the hapless volunteers afterwards, but it looks as if this alone isn’t enough. The researchers in the suicide notes experiment discovered that normal debriefing procedures are hopelessly ineffective in correcting pigheadedly held beliefs. Only by painstakingly explaining the belief-perseverance phenomenon, and describing how it might affect the volunteer, were the experimenters able to leave their volunteers in the same psychological condition in which they found them.
This is a little worrisome – although evidently not to psychology researchers. Of course, you can see it from a researcher’s point of view. Yes, you tell some helpful person who has kindly agreed to help you in your research that, oh dear, he’s scored embarrassingly low on a test compared with almost everyone else who’s ever passed through the lab. But then, probably less than an hour later, you clearly explain that what you told him wasn’t true, that you didn’t even trouble to mark his test. It’s hard to credit that this might be insufficient to rid even the most self-doubting individual of any lingering doubts.
Clearly, however, normal debriefing is strangely inadequate. Why is it that beliefs take such an immediate and tenacious grasp of our brains? One answer is that our rich, imaginative and generally spurious explanations of things are to blame. You hear a rumour that a friend’s teenager is pregnant. Discussing her dubious situation with another friend, you sadly call attention to the parents’ regrettable insistence on treating adolescents as if they were adults, the laissez-faire attitude of the mother towards curfews, and the risqué clothes in which they let their daughter appear in public. In the face of such parental licence, the young woman’s predicament takes on a tragic inevitability. As a result, when you subsequently learn that the rumoured pregnancy concerned someone else’s daughter, you find yourself thinking that it is only a matter of time before the slandered girl suffers the same misfortune. You may even comment, with the satisfying (if, in your case, misguided) confidence of Cassandra, that, ‘There’s no smoke without fire.’ The initial belief recruits its own web of supporting evidence, derived from the facile causal explanations that we’re so good at creating (and which, let’s be honest, are so much fun to indulge in). You can then take the initial fact away. The web of explanation is strong enough to support the belief without it.
In an experiment that simulated just this kind of gossipy social reasoning, volunteers were given a real clinical case history to read.17 One case study, ‘Shirley K.’, was an anxious young mother and housewife whose history included such misfortunes as divorce, the suicide of her lover, her father’s death, and the eventual commitment of her mother to a mental institution. Some of the volunteers were then asked to put themselves in the role of a clinical psychologist who had just learned that Shirley K. had subsequently committed suicide. They were asked what clues, if any, they found in Shirley K.’s life story that might help a psychologist explain or predict her suicide. The volunteers embraced this task with enthusiasm. They easily came up with plausible-sounding hypotheses; for example, that the suicide of her lover was ‘a model that led her to take her own life’. Once the volunteers had done this they were told that in fact nothing was known about Shirley K.’s future life. The suicide they had been asked to explain was only hypothetical. However, the web of explanation had been spun. When asked how likely it was that Shirley K. would in fact commit suicide, the volunteers rated this as being much more likely than did another group of people who had not been asked to explain the hypothetical suicide. In fact, even people told beforehand that the suicide didn’t actually happen nonetheless found their theories about why a suicide might have occurred so convincing that they, too, pegged Shirley K. as a high suicide risk.
A later study showed just how crucial these sorts of speculations are in helping to bolster a belief. In a variation of the experiment in which volunteers were given made-up information about their ability to tell the difference between genuine and fake suicide notes, volunteers were told (as in the original experiment) that their performance was either superior or inferior. As before, some of the volunteers were then left free to run wild with theories to explain their supposed level of social sensitivity. When later told that the feedback they had been given had been fabricated, they nonetheless continued to cling to their newfound belief about their social abilities (just as did the volunteers in the original experiment). The false feedback they had received was by then just a small part of the ‘evidence’ they had for their opinion regarding their social sensitivity. Something very different happened, however, with a second group of volunteers who were prevented from searching for explanations for their allegedly good or bad performance on the task. These volunteers were immediately commanded to keep themselves busy in an absorbing task. Denied the opportunity to rummage in their brains for other evidence to support their flimsy belief about their social sensitivity, they sensibly abandoned the belief as soon as they learnt that it was based on lies. It’s our irresistible urge to play amateur psychologist that makes us so vulnerable to our initial beliefs, no matter how bluntly the facts they were based on may be discredited. It’s human nature to try to explain everything that happens around us, perhaps as a way to make life seem less capricious.
Our susceptibility to first impressions is compounded by another, rather endearing, human failing. We are credulous creatures who find it easy to believe, but difficult to doubt. The problem is that we believe things to be true as a matter of course. As psychologist Daniel Gilbert has put it, ‘you can’t not believe everything you read’.18 Of course, we are not lumbered with our gullible beliefs forever, or even for very long. However, it is only with some mental effort that we can decide that they are untrue. Our natural urge – our default position – is to believe. This may be because, in general, people speak the truth more often than not. It’s therefore more efficient to assume that things are true unless we have reason to think otherwise.
But there is a problem with this system. If your brain is too busy with other things to put in the necessary legwork to reject a porkie pie, then you’re stuck with that belief. Advertisers and car salesmen will be delighted to learn that incredulity really is hard work for us, or so research suggests. If your brain is distracted or under pressure, you will tend to believe statements that you would normally find rather dubious.19 In fact, you may even find yourself believing things you were explicitly told were untrue. In one demonstration of this failure to ‘unbelieve’, volunteers read from a computer screen a series of statements about a criminal defendant (for example, ‘The robber had a gun’).20 Some of the statements were false. The volunteers knew exactly which ones they were, because they appeared in a different colour of text. For some of the volunteers, the untrue statements they were shown were designed to make the crime seem more heinous. For others, the false testimony made the crime seem more forgivable. At the same time that the volunteers were reading the statements, a string of digits also marched across the computer screen. Some of the volunteers had to push a button whenever they saw the digit ‘5’. Banal though this may seem, doing this uses up quite a lot of mental resources. This meant that these volunteers had less brainpower available to mentally switch the labelling of the false statements from the default ‘true’ to ‘false’. These busy volunteers were much more likely to misremember false statements as true. What’s more, this affected how long they thought the criminal should serve in prison. When the false statements unfairly exacerbated the severity of the crime, the distracted volunteers sentenced him to prison for almost twice as long a stretch.
Indeed, if your reputation is under examination, the gullible brains of others can put you in serious jeopardy. Because of our bias towards belief, we are particularly susceptible to innuendo. In a simulation of media election coverage, volunteers read a series of headlines about political candidates, and then gave their impressions of each of the politicians.21 Unsurprisingly, headlines such as ‘BOB TALBERT ASSOCIATED WITH FRAUDULENT CHARITY’ left Talbert’s reputation in tatters. Astonishingly, though, the headline ‘IS BOB TALBERT ASSOCIATED WITH FRAUDULENT CHARITY?’ was just as damaging. And if you’re thinking of entering the public eye yourself, consider this: even the headline ‘BOB TALBERT NOT LINKED WITH FRAUDULENT CHARITY’ was incriminating in the eyes of the readers. Denials are, after all, nothing more than statements with a ‘not’ tagged on. The bit about ‘Bob Talbert’ and ‘fraudulent charity’ slips into our brains easily enough, but the ‘not’ somehow isn’t quite as effective as it should be in affecting our beliefs.22 We are suckers for innuendo, even – as the study went on to show – when it comes from a disreputable source like a tabloid. Though we all think ourselves immune to it, negative campaigning works.
For any defendant under scrutiny in the courtroom, the beliefs of gullible brains are, of course, of crucial significance. Remember the joke circulating prior to the O. J. Simpson trial?
Knock, knock.
Who’s there?
O.J.
O.J. who?
You’re on the jury.
Pre-trial publicity is usually very bad news for a defendant whose future liberty or even life depends on the machinations of twelve pigheaded brains.23 Perhaps because of our susceptibility to innuendo and even denials, media reports of crime encourage a pro-prosecution stance in jurors. It has been shown that the more people know about a case before the trial, the more guilty they think the defendant. And grisly media coverage aggravates the lock-him-up attitude even further, even though the brutality of a crime obviously has no bearing whatsoever on whether that particular defendant is guilty. A juror who wallows in pre-trial publicity skews Justice’s scales against the defendant, and the pigheaded brain that then biases, distorts and even makes up evidence to support this belief in the defendant’s guilt certainly won’t help to restore the balance.
And it is not just jurors who should be on their guard. Prurient spectators of high-publicity trials, too, are lulled into complacent self-assurance. Looking back on the trial from a post-verdict vantage point, the brain implacably refuses to concede that its predictive powers were ever anything less than perfect. ‘I knew it all along’, you tell yourself, surreptitiously adjusting your memory. With the benefit of hindsight, what has happened seems inevitable and foreseeable – and you convince yourself that foresee it you did. Amid the scandal of the Bill Clinton impeachment trial, researchers interested in the hindsight-bias phenomenon asked people to estimate, at periods of both three weeks and three days before the much anticipated verdict, how likely it was that Clinton would be convicted.24 The media reports during this period made it seem increasingly likely that Clinton would be let off the hook, and the respondents’ speculations over that time as to his chances did change accordingly. No more than four days after the verdict, these people humbly and correctly remembered that their opinion had shifted over time towards the correct view that Clinton would be acquitted. But just a week after that, they were brashly claiming that they’d been pretty sure all along that Clinton wouldn’t be convicted. (They also believed that they, and they alone, enjoyed these powers of early prophecy. Asked about the speculations of the average American, or their best friend, they judged that these inferior beings were slower to read the runes than they themselves had been.)