Humankind

by Rutger Bregman


  But topping my list was Stanley Milgram. I know of no other study as cynical, as depressing and at the same time as famous as his experiments at the shock machine. By the time I’d completed a few months’ research, I reckoned I’d gathered enough ammunition to settle accounts with his legacy. For starters, there are his personal archives, recently opened to the public. It turns out that they contain quite a bit of dirty laundry.

  ‘When I heard that archival material was available,’ Gina Perry told me during my visit to Melbourne, ‘I was eager to look behind the scenes.’ (This is the same Gina Perry who exposed the Robbers Cave Experiment as a fraud; see Chapter 7.) And so began what Perry called ‘a process of disillusionment’, culminating in a scathing book documenting her findings. What she uncovered had turned her from Milgram fan into fierce critic.

  Let’s first take a look at what Perry found. Again, it’s the story of a driven psychologist chasing prestige and acclaim. A man who misled and manipulated to get the results he wanted. A man who deliberately inflicted serious distress on trusting people who only wanted to help.

  2

  The date is 25 May 1962. The final three days of the experiment have begun. Nearly one thousand volunteers have had their turn at his shock machine, when Milgram realises something’s missing. Pictures.

  A hidden camera is hurriedly installed to record participants’ reactions. It’s during these sessions that Milgram finds his star subject, a man whose name would become synonymous with the banality of evil. Or, rather, his pseudonym: Fred Prozi. If you’ve ever seen footage of Milgram’s experiments, in one of hundreds of documentaries or in a clip on YouTube, then you’ve probably seen Prozi in action. And just like Zimbardo and prisoner 8612, it was the Fred Prozi recordings that made Milgram’s message hit home.

  We see a friendly-looking, heavyset man of around fifty who, with evident reluctance, does what he’s told. ‘But he might be dead in there!’ he cries in distress–and then presses the next switch.10 Watching the drama unfold, the viewer is hooked, both horrified and fascinated to see how far Prozi will go.

  It makes for sensational television, and Milgram knew it. ‘Brilliant,’ he called Prozi’s performance. He was thrilled with Prozi’s ‘complete abdication and excellent tension’, and determined to cast him as his movie’s leading character.11 If you’re thinking Milgram sounds more like a director than a scientist, you’re not far off, for it was as a director that he really shone.

  Anyone who deviated from his script was brought to heel with intense pressure. The man in the grey lab coat–a biology teacher named John Williams whom Milgram had hired–would make as many as eight or nine attempts to get people to continue pressing higher switches. He even clashed with one forty-six-year-old woman who turned the shock machine off. Williams turned it back on and demanded she continue.12

  ‘The slavish obedience to authority,’ writes Gina Perry, ‘comes to sound much more like bullying and coercion when you listen to these recordings.’13

  The key question is whether the experimental subjects believed they were administering real shocks at all. Shortly after the experiment, Milgram wrote that ‘with few exceptions subjects were convinced of the reality of the experimental situation’.14 Yet his archives are filled with statements from participants expressing doubt. Perhaps that’s not very surprising when you consider how bizarre this situation must have seemed. Were people seriously expected to believe that someone was being tortured and killed under the watchful eye of scientists from a prestigious institution like Yale?

  When the study was over, Milgram sent participants a questionnaire. One question was: how believable did you find the situation? Not until ten years later did he finally publish their answers, in the very last chapter of his book about the experiments. This is where we discover that only 56 per cent of his subjects believed they were actually inflicting pain on the learner. And that’s not all. A never-published analysis by one of Milgram’s assistants reveals that the majority of people called it quits if they did believe the shocks were real.15

  So if nearly half the participants thought the setup was fake, where does that leave Milgram’s research? Publicly, Milgram described his discoveries as revealing ‘profound and disturbing truths of human nature’. Privately, he had his doubts. ‘Whether all of this ballyhoo points to significant science or merely effective theater is an open question,’ he wrote in his personal journal in June 1962. ‘I am inclined to accept the latter interpretation.’16

  When he published his results in 1963, Milgram’s shock experiment met with abhorrence. ‘Open-eyed torture’, ‘vile’ and ‘in line with the human experiments of the Nazis’ were just a few ways the press characterised what he’d done.17 The public outcry led to new ethical guidelines for experimental research.

  All that time Milgram was keeping another secret. He chose not to inform some six hundred participants afterwards that the shocks in the experiment had not been real. Milgram was afraid the truth about his research would get out and he’d no longer be able to find test subjects. And so hundreds of people were left thinking they’d electrocuted another human being.

  ‘I actually checked the death notices in the New Haven Register for at least two weeks after the experiment,’ one said later, ‘to see if I had been involved and a contributing factor in the death of the so-called learner.’18

  3

  In the first version of this chapter, I left it at that. My conclusion was that, like Philip Zimbardo’s sadistic play-acting, Milgram’s research had been a farce.

  But in the months after meeting Gina Perry I was plagued by a nagging doubt. Could it be that I was just a little too keen to kick the shock machine to the curb? I thought back to Milgram’s poll among almost forty colleagues, asking them to forecast how many subjects would go up to the full 450 volts. Every single one had predicted that only people who were genuinely crazy or disturbed would press that final switch.

  One thing is certain: those experts were dead wrong. Even factoring in Milgram’s biased point of view, his bullying assistant and the scepticism among his volunteers, there were still too many people who bowed to authority. Too many ordinary people believed the shocks were real and still continued to press the highest switch. No matter how you look at it, Milgram’s results remain seriously disturbing.

  And it’s not only Milgram’s results. Psychologists the world over have replicated his shock experiment in various iterations, with minor modifications (such as a shorter duration) to satisfy university ethics boards. As much as there is to criticise about these studies, the uncomfortable fact is that, over and over again, the outcome is the same.

  Milgram’s research seems unassailable. Bulletproof. Like a zombie that refuses to die, it just keeps coming back. ‘People have tried to knock it down,’ says one American psychologist, ‘and it always comes up standing.’19 Evidently, ordinary human beings are capable of terrible cruelty towards one another.

  But why? Why does Homo puppy hit the 450-volt switch, if we’re hardwired to be kind?

  That’s the question I needed to answer.

  The first thing I wondered was whether Milgram’s obedience experiments really tested obedience at all. Take the script he wrote up for Williams–the ‘experimenter’ in the grey lab coat–which directed him to give defiant subjects four specific ‘prods’.

  First: ‘Please continue.’

  Next: ‘The experiment requires that you continue.’

  After that: ‘It is absolutely essential that you continue.’

  And only in the last place: ‘You have no other choice, you must go on.’

  Modern-day psychologists have pointed out that only this last line is an order. And when you listen to the tapes, it’s clear that as soon as Williams utters these words, everybody stops. The effect is instant disobedience. This was true in 1961, and it has held true whenever Milgram’s experiment has been replicated since.20

  Painstaking analyses of the hundreds of sessions at Milgram’s shock machine furthermore reveal that subjects grew more disobedient the more overbearing the man in the grey coat became. Put differently: Homo puppy did not brainlessly follow the authority’s orders. Turns out we have a downright aversion to bossy behaviour.

  So then how was Milgram able to induce his subjects to keep pressing the switches? Alex Haslam and Steve Reicher, the psychologists behind the BBC Prison Study (see Chapter 7), have come up with an intriguing theory. Rather than submitting to the grey-coated experimenter, the participants decided to join him. Why? Because they trusted him.

  Haslam and Reicher note that most people who volunteered for the study arrived feeling helpful. They wanted to help Mr Williams with his work. This would explain why participants’ willingness to continue declined when Milgram conducted the experiment in a plain office as opposed to the lofty setting of Yale. It could also explain why ‘prods’ invoking a scientific objective (like ‘The experiment requires that you continue’) were the most effective,21 and why the participants behaved not like mindless robots, but were racked with doubt.

  On the one hand, the teachers identified with the man in the grey lab coat, who kept repeating that the whole thing was in the interest of science. On the other, they couldn’t ignore the suffering of the learner in the other room. Participants repeatedly cried, ‘I can’t take this anymore’ and ‘I’m quitting’, even as they pressed the next switch.

  One man said afterwards that he had persisted for his daughter, a six-year-old with cerebral palsy. He hoped that the medical world would one day find a cure: ‘I can only say that I was–look, I’m willing to do anything that’s ah, to help humanity, let’s put it that way.’22

  When Milgram subsequently told his subjects that their contribution would benefit science, many expressed relief. ‘I am happy to have been of service’ was a typical response, and, ‘Continue your experiments by all means as long as good can come of them. In this crazy mixed-up world of ours, every bit of goodness is needed.’23

  When psychologist Don Mixon repeated Milgram’s experiment in the seventies, he arrived at the same conclusion. He later noted, ‘In fact, people go to great lengths, will suffer great distress, to be good. People got caught up in trying to be good…’24

  In other words, if you push people hard enough, if you poke and prod, bait and manipulate, many of us are indeed capable of doing evil. The road to hell is paved with good intentions. But evil doesn’t live just beneath the surface; it takes immense effort to draw it out. And most importantly, evil has to be disguised as doing good.

  Ironically, good intentions also played a major role in the Stanford Prison Experiment, from Chapter 7. Student guard Dave Eshelman, who wondered if he would have taken things as far if he hadn’t been explicitly instructed to do so, also described himself as a ‘scientist at heart’.25 Afterwards, he said he felt he had done something positive, ‘because I had contributed in some way to the understanding of human nature’.26

  This was also true for David Jaffe, Zimbardo’s assistant who came up with the original prison study concept. Jaffe encouraged the well-meaning guards to take a tougher line by pointing to the noble intentions behind the study. ‘What we want to do,’ he told a wavering guard, ‘is be able to […] go to the world with what we’ve done and say “Now look, this is what happens when you have Guards who behave this way.” But in order to say that we have to have Guards who behave that way.’27

  Ultimately, David Jaffe and Philip Zimbardo wanted their work to galvanise a complete overhaul of the prison system. ‘Hopefully what will come out of this study is some very serious recommendations for reform,’ Jaffe assured the guard. ‘This is our goal. We’re not trying to do this just because we’re all, um, sadists.’28

  4

  That brings us back to Adolf Eichmann. On 11 April 1961, the Nazi officer’s trial for war crimes began. Over the next fourteen weeks, hundreds of witnesses took the stand. For fourteen weeks the prosecution did its best to show what a monster Eichmann was.

  But this was more than a court case alone. It was also a massive history lesson, a media spectacle to which millions of people tuned in. Among them was Stanley Milgram, described by his wife as a ‘news addict’, who closely followed the progress of the trial.29

  Hannah Arendt, meanwhile, had a seat in the courtroom. ‘The trouble with Eichmann,’ she later wrote, ‘was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were and still are, terribly and terrifyingly normal.’30 In the years that followed, Eichmann came to stand for the mindless ‘desk murderer’–for the banality of evil in each of us.

  Only recently have historians come to some very different conclusions. When the Israeli secret service captured Eichmann in 1960, he’d been hiding out in Argentina. There, he’d been interviewed by former Dutch SS officer Willem Sassen for several months. Sassen hoped to get Eichmann to admit that the Holocaust was all a lie fabricated to discredit the Nazi regime. He was disappointed.

  ‘I have no regrets!’ Eichmann assured him.31 Or as he’d already declared in 1945: ‘I will leap into my grave laughing because the feeling that I have five million human beings on my conscience is for me a source of extraordinary satisfaction.’32

  Reading through the thirteen hundred pages of interviews, teeming with warped ideas and fantasies, it’s patently obvious that Eichmann was no brainless bureaucrat. He was a fanatic. He acted not out of indifference, but out of conviction. Like Milgram’s experimental subjects, he did evil because he believed he was doing good.

  Although transcripts of the Sassen interviews were available at the time of the trial, Eichmann managed to cast doubt on their authenticity. And so he put the whole world on the wrong track. All that time, the interview tapes lay mouldering in the Bundesarchiv in Koblenz, where the philosopher Bettina Stangneth found them fifty years later. What she heard confirmed that Sassen’s transcripts were genuine.

  ‘I never did anything, great or small, without obtaining in advance express instructions from Adolf Hitler or any of my superiors,’ Eichmann testified during the trial. This was a brazen lie. And his lie would be parroted by countless Nazis who professed that they were ‘just following orders’.

  Orders handed down within the Third Reich’s bureaucratic machine tended to be vague, historians have since come to realise. Official commands were rarely issued, so Hitler’s adherents had to rely on their own creativity. Rather than simply obeying their leader, they ‘worked towards him’, as historian Ian Kershaw puts it, attempting to act in the spirit of the Führer.33 This inspired a culture of one-upmanship in which increasingly radical Nazis devised increasingly radical measures to get in Hitler’s good graces.

  In other words, the Holocaust wasn’t the work of humans suddenly turned robots, just as Milgram’s volunteers didn’t press switches without stopping to think. The perpetrators believed they were on the right side of history. Auschwitz was the culmination of a long and complex historical process in which the voltage was upped step by step and evil was more convincingly passed off as good. The Nazi propaganda mill–with its writers and poets, its philosophers and politicians–had had years to do its work, blunting and poisoning the minds of the German people. Homo puppy was deceived and indoctrinated, brainwashed and manipulated.

  Only then could the inconceivable happen.

  Had Hannah Arendt been misled when she wrote that Eichmann wasn’t a monster? Had she been taken in by his act on the stand?

  That is the opinion of many historians, who cite her book as a case of ‘great idea, bad example’.34 But some philosophers disagree, arguing that these historians have failed to understand Arendt’s thinking. For Arendt did in fact study parts of Sassen’s interviews with Eichmann during the trial, and nowhere did she write that Eichmann was simply obeying orders.

  What’s more, Arendt was openly critical of Milgram’s obedience experiments. As much as the young psychologist admired the philosopher, the sentiment wasn’t mutual. Arendt accused Milgram of a ‘naïve belief that temptation and coercion are really the same thing’.35 And, unlike Milgram, she didn’t think a Nazi was hiding in each of us.

  Why did Milgram and Arendt enter the history books together? Some Arendt experts believe it’s because she was misinterpreted. She was one of those philosophers who spoke in aphorisms, using enigmatic phraseology that could easily be misunderstood. Take her statement that Eichmann ‘did not think’. She didn’t say he was a robotic desk killer, but, rather, as Arendt expert Roger Berkowitz points out, that Eichmann was unable to think from someone else’s perspective.36

  In point of fact, Hannah Arendt was one of those rare philosophers who believe that most people, deep down, are decent.37 She argued that our need for love and friendship is more human than any inclination towards hate and violence. And when we do choose the path of evil, we feel compelled to hide behind lies and clichés that give us a semblance of virtue.

  Eichmann was a prime example. He’d convinced himself he’d done a great deed, something historic for which he’d be admired by future generations. That didn’t make him a monster or a robot. It made him a joiner. Many years later, psychologists would reach the same conclusion about Milgram’s research: the shock experiments were not about obedience. They were about conformity.

  It’s astonishing how far ahead of her time Hannah Arendt was when she made precisely the same observation.

  Sadly, Stanley Milgram’s simplistic deductions (that humans submit to evil without thinking) made a more lasting impression than Hannah Arendt’s layered philosophy (that humans are tempted by evil masquerading as good). This speaks to Milgram’s directorial talent, to his eye for drama and his astute sense of what works on television.

  But above all, I think what made Milgram famous was that he furnished evidence to support an age-old belief. ‘The experiments seemed to offer strong support,’ writes psychologist Don Mixon, ‘for history’s oldest, most momentous self-fulfilling prophecy–that we are born sinners. Most people, even atheists, believe that it is good for us to be reminded of our sinful nature.’38

 
