A Mind of Its Own


by Cordelia Fine


  The dreadful insight into our moral frailty that this research offers us should be edifying. And yet – what do you know – our immoral brains have a way of convincing us that the regrettable moral deficiencies of others have few lessons to teach us with regard to our own saintly dispositions. Researchers described Milgram’s electric-shock experiment to psychology students and then asked them to look into their souls and speculate on what they would have done in the same situation.16 Some of the students were already rather knowledgeable about Milgram’s legendary research. If they were to learn anything from his work, it was that it’s not so much the kind of person you are as the pressures of the situation in which you find yourself that will determine how you behave. Yet their education failed to bring them self-enlightenment. They confidently predicted that they would defy the experimenter far earlier than would a typical volunteer.17 Indeed, although they were also all well versed in the self-conceits to which we are susceptible, their self-portraiture, as they imagined themselves as one of Milgram’s teachers, was no less flattering than that of students unschooled in both Milgram’s findings and the brain’s narcissism.

  The problem is that you may know, intellectually, that people’s moral stamina is but a leaf blown hither and thither by the winds of circumstance. You may be (and indeed now are) comprehensively informed about the self-enhancing distortions of the human brain. Yet this knowledge is almost impossible to apply to oneself. Somehow, it fails dismally to penetrate the self-image. Can you imagine yourself delivering extreme and intensely painful electric shocks to a protesting fellow human being? Of course not. I doubt if anyone reading this book can picture themselves behaving in this way. But the fact is, if you had been one of Milgram’s many unsuspecting teachers, you almost certainly would have behaved just like everyone else. (Go on, admit it. Even now, you’re thinking that you’d have been one of the rare people who defied the experimenter.)

  Unfortunately, our refusal to acknowledge the truth in the homily ‘There but for the grace of God go I’ does more damage than simply keeping us peacefully smug about our own moral superiority. When we ignore the power of circumstances to overwhelm personality, we wind up misguidedly looking at a person’s character in order to explain their failure to uphold an ideally high standard of conduct (the one that we ourselves would doubtless maintain).18 And we persist in this habit, known as the correspondence bias, even when we should know better. Students shown the film of Milgram’s experiments, Obedience, made this very mistake.19 Instead of acknowledging the unexpected power of the experimenter’s authority, they fell back on their old bad habit of presuming that a person’s behaviour offers unconditional insight into his inner nature. The students watching the film inferred that there were dark, sadistic shadows lurking in the souls of Milgram’s participants. This became clear in the second part of the experiment, when they were told about a variation of Milgram’s research in which teachers were free to set the shock generator at whatever level they wanted. Asked to guess what level of shock teachers would use in this version of the experiment, they hugely overestimated the intensity of shocks that Milgram’s participants actually delivered. By pointing the finger of blame at the person, rather than the situation, they unfairly pegged the participants as ‘wolves rather than sheep’, as the researchers put it.

  Nor can we console ourselves by supposing that, in settings more familiar to us than a macabre psychology experiment, we do a better job of sizing up the balance of scruples and situations. Students lectured at length about the findings of the Good Samaritan experiment remained insistent that someone who scuttled past the groaning man must be particularly black of heart, rather than merely a pawn of his pressing affairs. Asked to predict how he would behave if he had time to spare, the students anticipated – wrongly – that even then he would callously disregard the victim.20 They were no more sensitive to how we are all influenced by our current situation than were other students who knew little about the original experiment.

  Our thoughtless dismissal of how the unobtrusive pressures of the scene around us can mould behaviour may cause us to depreciate others in yet another way. It puts us at risk of overlooking the impressive strength of character displayed by those rare people who do indeed manage to break free of the constraints set by their particular situations and circumstances. Thinking that any decent person (ourselves included) would have done the same, we may be heedless of the moral fibre shown by the few people who defied the commands of Milgram’s experimenter to continue shocking the learner, or who, hard-pushed for time, nonetheless stopped to help someone in need.21

  The masterful hypocrisy of the immoral brain demands a certain grudging respect. It lazily applies nothing but the most superficial and disapproving analysis of others’ misdemeanours, while bending over backwards to reassure you that you can do no wrong. Of course there is always potential for embarrassment whenever we deviate (as we inevitably do) from the impeccable ethical standards we believe ourselves to live by. As we have already seen, the brain can sometimes deal with this awkwardness by adeptly supplying excuses to explain away the unrepresentative flaws in your conduct.

  But what if there are no obvious mitigating circumstances to call upon? With a little mental shuffling, there are other ways to rebalance the ledger. When we find ourselves behaving in a manner that is inconsistent with our moral code, rather than acknowledging our duplicity we can craftily adapt our beliefs to make the behaviour itself seem satisfactory after all. In the classic demonstration of these underhand accounting practices at work, volunteers spent a tedious hour emptying and refilling trays of spools, and twisting pegs quarter-turns on a board.22 Over and over again. When the hour was finally up, the experimenter made it seem as though the study was over (although, in fact, it had hardly begun). Pushing back his chair and lighting up a cigarette, he explained that there were actually two separate groups taking part in this experiment. One half were being told beforehand, by his accomplice, that the tasks they were about to perform were interesting, intriguing and exciting. The other group (to which the participant supposedly belonged) received no such introduction. (According to the cover story, the researchers were interested in how the effusive claims made beforehand affected performance.) Feigning some embarrassment, the experimenter then asked if the participant, who’d just staggered to the end of his hour of mind-cracking tedium, would mind taking the place of the accomplice, who had failed to show up on time. All he had to do, said the experimenter, was to tell the next participant how much fun he’d just been having with the spools and the pegs. Some of the participants were offered one dollar, others twenty dollars, to tell these lies.

  Almost all agreed to collude in the experimental deception. For those offered twenty dollars (a hefty sum in the 1950s when this study was done), it made perfect sense to tell an inconsequential lie for such a generous reward. Who wouldn’t choose to do the same, or forgive it in another? But the participants who’d been offered only one dollar couldn’t explain their behaviour in the same way. Failing to realise the subtle pressure on them to comply with the experimenter’s request, they were placed in a rather uncomfortable position. On the one hand, they had just spent a dreary hour of their precious life performing stupefyingly boring tasks; on the other hand they had, for no apparent good reason, just told the next participant to hold onto her hat as the thrills began. Were they really the sort of person who would lie for a dollar? Of course not. And yet, uncomfortably inconsistent with this conviction in their good and honest character, was their awareness of the fib they had just told. In order to deal with this cognitive dissonance, as it is known, the men surreptitiously adjusted how they felt about the experiment. Asked after this part of the experiment to say, in all honesty, how interesting and enjoyable they had found it, those clutching their one paltry dollar claimed to have had a much better time than those with a roll of twenty bulging in their back pocket.

  There is one final strategy available to the immoral brain as it goes about its important business of nipping in the bud, swiftly and efficiently, any moral misgivings we might otherwise experience. We can persuade ourselves that, really, there is no ethical dimension at all to the situation in which we find ourselves. This way, if there’s no moral duty to be done, why should we feel bad about doing nothing? How was it that none of the 38 witnesses to a fatal stabbing of a young woman in Queens, New York, intervened or called the police? Because ‘we thought it was a lovers’ quarrel’, said one woman. ‘I went back to bed.’23 And in covert laboratory set-ups designed to give unsuspecting participants the opportunity to showcase their social conscience, the excuses given by the many who remain apathetically idle are even more remarkable.24 People who fail to report smoke billowing into a room suggest that it is simply smog or steam. People who don’t help a woman who has just fallen off a ladder claim that she hadn’t actually fallen, or wasn’t really injured. Did the hurried participants in the Good Samaritan study convince themselves that the coughing, groaning wretch in the alleyway didn’t really need help? Quite possibly. It’s just more comfortable that way.

  As any parent knows, it is a long haul from the joyful lawlessness of toddlerhood to the moral maturity of adulthood. Currently, one of my son’s favourite misdeeds is to roll his baby brother from his tummy onto his back. Normally a sweet and affectionate older brother, occasionally, when parental eyes are diverted, he gives in to temptation and trundles the baby over on the playmat. What he does next starkly exposes his childish lack of understanding of grown-up notions of right and wrong. He does not pretend that the baby deserved it, nor blame the baby for being so seductively rotund. He does not excuse himself by calling attention to his tender age. He makes no claim that other, less pliant toddlers would flip the baby over much more frequently. Nor does he even appear to consider suggesting that the baby’s tears stem from joy, rather than shocked bewilderment at finding himself unexpectedly staring at the ceiling. Instead, he does something that no self-respecting adult brain would ever permit.

  He chastens himself with a cry of ‘Naughty Isaac!’ and, with genuine humility, places himself in the naughty corner.

  He has much to learn.

  Notes

  1 J. Haidt (2001), ‘The emotional dog and its rational tail: a social intuitionist approach to moral judgment’, Psychological Review, 108: 814–34.

  2 J. Haidt and M.A. Hersh (2001), ‘Sexual morality: the cultures and emotions of conservatives and liberals’, Journal of Applied Social Psychology, 31: 191–221.

  3 J.S. Lerner, J.H. Goldberg and P.E. Tetlock (1998), ‘Sober second thought: the effects of accountability, anger, and authoritarianism on attributions of responsibility’, Personality and Social Psychology Bulletin, 24: 563–74.

  4 M.J. Lerner (1980), The belief in a just world: a fundamental delusion, New York and London: Plenum Press.

  5 See M.J. Lerner (1980), ibid.; also L. Montada and M.J. Lerner (1998), Responses to victimizations and belief in a just world, New York and London: Plenum Press.

  6 See, for example, G. Younge (2005), ‘Murder and rape – fact or fiction?’, The Guardian, 6 September. Retrieved on 24 October 2005 from: http://www.guardian.co.uk/katrina/story/0,16441,1563532,00.html

  7 R. Buehler, D. Griffin and M. Ross (1994), ‘Exploring the “Planning Fallacy”: why people underestimate their task completion times’, Journal of Personality and Social Psychology, 67: 366–81.

  8 F.D. Fincham, S.R. Beach and D.H. Baucom (1987), ‘Attribution processes in distressed and nondistressed couples: 4. Self-partner attribution differences’, Journal of Personality and Social Psychology, 52: 739–48.

  9 A. Schütz (1999), ‘It was your fault! Self-serving biases in autobiographical accounts of conflicts in married couples’, Journal of Social and Personal Relationships, 16: 193–208.

  10 F.D. Fincham, S.R. Beach and D.H. Baucom (1987), ‘Attribution processes in distressed and nondistressed couples: 4. Self-partner attribution differences’, Journal of Personality and Social Psychology, 52: 739–48.

  11 A. Schütz (1999), ‘It was your fault! Self-serving biases in autobiographical accounts of conflicts in married couples’, Journal of Social and Personal Relationships, 16: 193–208.

  12 J. Kruger and T. Gilovich (2004), ‘Actions, intentions, and self-assessment: the road to self-enhancement is paved with good intentions’, Personality and Social Psychology Bulletin, 30: 328–39.

  13 S. Milgram (1963), ‘Behavioral study of obedience’, Journal of Abnormal and Social Psychology, 67: 371–8.

  14 E. Tarnow (2000), ‘Self-destructive obedience in the airplane cockpit and the concept of obedience optimization’, in T. Blass (ed.), Obedience to authority: Current perspectives on the Milgram paradigm, Mahwah, NJ: Lawrence Erlbaum Associates (pp. 111–23).

  15 J.M. Darley and C.D. Batson (1973), ‘“From Jerusalem to Jericho”: A study of situational and dispositional variables in helping behavior’, Journal of Personality and Social Psychology, 27: 100–19.

  16 G. Geher, K.P. Bauman, S.E.K. Hubbard and J.R. Legare (2002), ‘Self and other obedience estimates: biases and moderators’, Journal of Social Psychology, 142: 677–89.

  17 On average, they saw themselves delivering nothing more powerful than about 140 volts, at the low end of the ‘Strong Shock’ category on Milgram’s shock generator control panel.

  18 See D.T. Gilbert and P.S. Malone (1995), ‘The correspondence bias’, Psychological Bulletin, 117: 21–38. The correspondence bias is also known as the fundamental attribution error.

  19 M.A. Safer (1980), ‘Attributing evil to the subject, not the situation: Student reaction to Milgram’s film on obedience’, Personality and Social Psychology Bulletin, 6: 205–09.

  20 P. Pietromonaco and R.E. Nisbett (1982), ‘Swimming upstream against the fundamental attribution error: Subjects’ weak generalizations from the Darley and Batson study’, Social Behavior and Personality, 10: 1–4.

  21 A suggestion made by D.T. Gilbert and P.S. Malone (1995), ‘The correspondence bias’, Psychological Bulletin, 117: 21–38.

  22 L. Festinger and J.M. Carlsmith (1959), ‘Cognitive consequences of forced compliance’, Journal of Abnormal and Social Psychology, 58: 203–10.

  23 A.M. Rosenthal (1999), Thirty-eight witnesses: the Kitty Genovese case, Berkeley, CA: University of California Press.

  24 See C.R. Snyder (1985), ‘Collaborative companions: the relationship of self-deception and excuse making’, in M.W. Martin (ed.), Self-deception and self-understanding: new essays in philosophy and psychology, Lawrence: University of Kansas Press (pp. 35–51).

  CHAPTER 4

  The Deluded Brain

  A slapdash approach to the truth

  When learned psychiatrists gathered together to brainstorm their way to an official description of delusions, they had a terrible time trying to come up with a definition that didn’t make a large proportion of the population instantly eligible for psychiatric services.1 One can imagine the increasingly frustrated attempts to position the line appropriately between sanity and madness. Dr Smith might kick off with her own pet definition of delusion.

  ‘I’m going to suggest a false belief.’

  We can envisage Dr Brown instantly reeking of sarcasm.

  ‘Wonderful! Now I shall be able to cure all of my paranoid patients instantly, simply by arranging for them to be properly persecuted.’

  Cunningly, Dr Smith adjusts her original definition to suit.

  ‘And, as I was just about to add, this must be a false belief held despite evidence to the contrary.’

  Dr Brown’s scorn continues unabated.

  ‘Oh, I see. Such as, for example, your tenaciously held views on the beneficial effects of psychoanalysis for manic-depression!’

  Dr Smith hits back in the most ferocious fashion available to academics.

  ‘Well, if it’s that recent article of yours in the Journal of Psychiatry you think should have changed my mind, I hardly consider that a convincing source of contrary evidence!’

  At this point one imagines a third party, let us call her Dr Jones, stepping in smoothly.

  ‘Doctors, please! What about a false belief held despite incontrovertible and obvious evidence to the contrary?’

  And so on and so forth until the coffee break.

  The trouble is, the question of evidence doesn’t help much. No one can prove to a psychotic patient that the devil isn’t in fact transmitting thoughts into his head, any more than they can prove wrong the 150 million Americans who think it possible for someone to be physically possessed by the devil. (Or, before entire nations start scoffing, the 25 million Britons who believe in communication with the dead.) But we can’t allow everyone with a common-or-garden belief in the paranormal to be defined into madness – there simply aren’t enough psychiatrists to cope. And perhaps that’s why the definition of delusion has, tacked onto it, the proviso that it must be a belief that almost no one else holds. So let us, like the little green men who swoop down in flying saucers to take a closer look at us, probe a little deeper …

  Our beliefs range from the run-of-the-mill to the strikingly bizarre, and many at each end of the spectrum embrace their own share of deviance from reality. Our first problem is that we are, at root, very poor scientists. All sorts of biases slip in unnoticed as we form and test our beliefs, and these tendencies lead us astray to a surprising degree. Of course, an ignoble agenda – the desire to see evidence for a belief we’d secretly prefer to hold – can wreak its prejudicing influence on our opinions. However, even when we genuinely seek the truth, our careless data collection and appraisal can leave us in woeful error about ourselves, other people and the world. Then consider our susceptibility to strange experiences. After all, hallucinations, déjà vu, premonitions, depersonalisation and religious experiences are not uncommon in the general population.2 Add these to our innate lack of scientific rigour and you have a perilous combination. And, it’s not yet clear exactly what it is that saves most of us from crossing the shadowy line that separates everyday delusions from the clinical variety.

 
