During the 1950s and 1960s, social psychologists began prying, probing and prodding to pin down what turns ordinary men and women into monsters. This new breed of scientist devised one experiment after another that showed humans are capable of appalling acts. A tweak in our situation is all it takes and–voila!–out comes the Nazi in each of us.
In the years that Lord of the Flies topped the bestseller lists, a young researcher named Stanley Milgram demonstrated how obediently people follow the orders even of dubious authority figures (Chapter 8), while the murder of a young woman in New York City laid the basis for hundreds of studies on apathy in the modern age (Chapter 9). And then there were the experiments by psychology professors Muzafer Sherif and Philip Zimbardo (Chapter 7), who demonstrated that good little boys can turn into camp tyrants at the drop of a hat.
What fascinates me is that all of these studies took place during a relatively short span of time. These were the wild west years of social psychology, when young hotshot researchers could soar to scientific stardom on the wings of shocking experiments.
Fifty years on, the young hotshots are dead and gone or travelling the globe as renowned professors. Their work is famous and continues to be taught to new generations of students. But now the archives of their post-war experiments have also been opened. For the first time, we can take a look behind the scenes.
7
In the Basement of Stanford University
1
It’s 15 August 1971. Shortly before ten in the morning on the West Coast, Palo Alto police arrive in force to pull nine young men out of their beds. Five are booked for theft, four for armed robbery. Neighbours look on in surprise as the men are frisked, handcuffed and whisked away in the waiting police cars.
What the bystanders don’t realise is that this is part of an experiment. An experiment that will go down in history as one of the most notorious scientific studies ever. An experiment that will make front-page news and become textbook fare for millions of college freshmen.
That same afternoon, the alleged criminals–in reality, innocent college students–descend the stone steps of Building 420 to the basement of the university’s psychology department. A sign welcomes them to THE STANFORD COUNTY JAIL. At the bottom of the stairs waits another group of nine students, all dressed in uniforms, their eyes masked by mirrored sunglasses. Like the students in handcuffs, they’re here to earn some extra cash. But these students won’t be playing prisoner. They’ve been assigned the role of guard.
The prisoners are ordered to strip and are then lined up naked in the hallway. Chains are clapped around their ankles, nylon caps pulled down over their hair and each one gets a number by which he’ll be addressed from this point on. Finally, they’re given a smock to wear and locked behind bars, three to a cell.
What happens next will send shockwaves around the world. In a matter of days, the Stanford Prison Experiment spins out of control–and in the process reveals some grim truths about human nature.
The basement of Stanford University, August 1971.
Source: Philip G. Zimbardo.
It started with a group of ordinary, healthy young men. Several of them, when signing on for the study, called themselves pacifists.
By the second day things had already begun to unravel. A rebellion among the inmates was countered with fire extinguishers by the guards, and in the days that followed the guards devised all kinds of tactics to break their subordinates. In cells reeking of human faeces, the prisoners succumbed one by one to the effects of sleep deprivation and debasement, while the guards revelled in their power.
One inmate, prisoner 8612, went ballistic. Kicking his cell door, he screamed: ‘I mean, Jesus Christ, I’m burning up inside! Don’t you know? I want to get out! This is all fucked up inside! I can’t stand another night! I just can’t take it anymore!’1
The study’s lead investigator, psychologist Philip Zimbardo, also got swept up in the drama. He played a prison superintendent determined to run a tight ship at any cost. Not until six days into the experiment did he finally call an end to the nightmare, after a horrified postgrad–his girlfriend–asked him what the hell he was doing. By then, five of the prisoners were exhibiting signs of ‘extreme emotional depression, crying, rage and acute anxiety’.2
In the aftermath of the experiment, Zimbardo and his team were faced with a painful question: what happened? Today you can find the answer in just about any introductory psychology textbook. And in Hollywood blockbusters, Netflix documentaries and mega-bestsellers like Malcolm Gladwell’s The Tipping Point. Or else swing by the office watercooler, where someone can probably fill you in.
The answer goes something like this. On 15 August 1971, a group of ordinary students morphed into monsters. Not because they were bad people, but because they’d been put in a bad situation. ‘You can take normal people from good schools and happy families and good neighborhoods and powerfully affect their behavior,’ Gladwell tells us, ‘merely by changing the immediate details of their situation.’3
Philip Zimbardo would later swear up and down that nobody could have suspected his experiment would get so out of hand. Afterwards, he had to conclude that we’re all capable of the most heinous acts. What happened in the basement of Stanford University had to be understood, he wrote, ‘as a “natural” consequence of being in the uniform of a “guard”’.4
2
Few people know that, seventeen years earlier, another experiment was conducted which came to much the same conclusion. Largely forgotten outside academia, the Robbers Cave Experiment would inspire social psychologists for decades. And unlike the Stanford study, its subjects were not student volunteers, but unsuspecting children.
It’s 19 June 1954. Twelve boys, all around eleven, are waiting at a bus stop in Oklahoma City. None of them know each other, but they’re all from upstanding, churchgoing families. Their IQs are average, as are their grades at school. None are known troublemakers or get bullied. They’re all well-adjusted, ordinary kids.
On this particular day, they’re excited kids. That’s because they’re on their way to summer camp at Robbers Cave State Park in south-east Oklahoma. Famous as the one-time hideout of legendary outlaws like Belle Starr and Jesse James, the camp covers some two hundred acres of forest, lakes and caves. What the boys don’t realise is that they’ll be sharing this paradise with another group of campers that arrives the next day. And what they also don’t know: this is a scientific experiment. The campers are the guinea pigs.
The study is in the hands of Turkish psychologist Muzafer Sherif, who has long been interested in how conflicts between groups arise. His preparations for the camp have been meticulous and his instructions for the research team are clear: the boys are to be free to do whatever they please, no holds barred.
In the first phase of the study, neither group of boys will be aware of the other’s existence. They’ll stay in separate buildings and assume they’re alone in the park. Then, in the second week, they’ll be brought into careful contact. What will happen? Will they become friends, or will all hell break loose?
The Robbers Cave Experiment is a story about well-behaved little boys–‘the cream of the crop,’ as Sherif later described them–who in the space of a few days degenerate into ‘wicked, disturbed, and vicious bunches of youngsters’.5 Sherif’s camp took place in the same year that William Golding published his Lord of the Flies, but while Golding thought kids are bad by nature, Sherif believed everything hinges on context.
Things start out pleasantly enough. During the first week, when the two groups are still oblivious to one another’s existence, the boys in each camp work together in perfect accord. They build a rope bridge and a diving board. They grill hamburgers and pitch tents. They run and play and they all become fast friends.
The next week, the experiment takes a turn. The two groups, having christened themselves the ‘Rattlers’ and the ‘Eagles’, are cautiously introduced to one another. When the Rattlers hear the Eagles playing on ‘their’ baseball field and challenge their counterparts to a game, it touches off a week of rivalry and competition. From there on out, things escalate quickly. On day two the Eagles burn the Rattlers’ flag after losing at tug-of-war. The Rattlers retaliate with a midnight raid where they tear up curtains and loot comic books. The Eagles decide to settle the score by stuffing their socks with heavy rocks to use as weapons. In the nick of time, the camp staff manage to intervene.
At the end of the week’s tournament, the Eagles are declared the victors and get the coveted prize of shiny pocketknives. The Rattlers take revenge by mounting another raid and making off with all the prize booty. When confronted by the furious Eagles, the Rattlers only jeer. ‘Come on, you yellow bellies,’ taunts one of them, brandishing the knives.6
As the boys begin duking it out, Dr Sherif, posing as the camp caretaker, sits off to one side, busily scribbling his notes. He could tell already: this experiment was going to be a goldmine.
The story of the Robbers Cave Experiment has made a comeback in recent years, especially since Donald Trump was elected president of the United States. I can’t tell you how many pundits have pointed to this study as the anecdotal key to understanding our times. Aren’t the Rattlers and the Eagles a symbol for the ubiquitous clashes between left and right, conservative and progressive?
Television producers looked at the study’s premise and saw a hit. In Holland, they attempted a thinly veiled remake aptly titled ‘This Means War’. But shooting had to be terminated prematurely when it turned out the concept really did mean war.
Reasons enough to crack open Muzafer Sherif’s original 1961 research report. Having read it, I can assure you: a page-turner it is not. On one of the first pages, Sherif tells us, ‘Negative attitudes towards outgroups will be generated situationally.’ Read: this means war.
But in among all the academic abstraction I found some interesting facts. For starters, it wasn’t the kids themselves, but the experimenters who decided to hold a week of competitions. The Eagles weren’t keen on the idea. ‘Maybe we could make friends with those guys,’ one boy suggested, ‘and then somebody wouldn’t get mad and have any grudges.’7
And at the researchers’ insistence, the groups only played games that had clear-cut winners and losers, like baseball and tug-of-war. There were no consolation prizes, and the researchers manipulated scores to ensure the teams would stay in a neck-and-neck race.
Turns out these machinations were only the beginning.
3
I meet Gina Perry in Melbourne in the summer of 2017, just months before the publication of her book on the Robbers Cave Experiment. Perry is an Australian psychologist and was the first person to delve into the archives of Sherif’s experiment. As she dug through reams of notes and recordings, she uncovered a story that contradicts everything the textbooks have been repeating for the past fifty years.
To begin with, Perry discovered that Sherif had tried to test his ‘realistic conflict theory’ before. He’d orchestrated another summer camp in 1953 outside the small town of Middle Grove in New York State. And there, too, he’d done his best to pit the boys against one another. The only thing Sherif was willing to say about it afterwards–tucked away in a footnote–was that the experiment had to be suspended ‘due to various difficulties and unfavorable conditions’.8
In Melbourne, Perry tells me what she learned from the archives about what actually happened at that other, forgotten summer camp. Two days after their arrival, the boys had all become friends. They played games and ran wild in the woods, shot with bows and arrows and sang at the top of their lungs.
When day three rolled around, the experimenters split them up into two groups–the Panthers and the Pythons–and for the rest of the week they deployed every trick in the book to turn the two teams against each other. When the Panthers wanted to design team T-shirts that featured the olive branch of peace, the staff put a stop to it. A few days later, one of the experimenters tore down a Python tent, expecting the Panthers would take the heat for it. He looked on in frustration as the groups worked together to put the tent back up.
Next, the staff secretly raided the Panther camp, hoping the Pythons would get blamed. Once more, the boys helped each other out. One boy whose ukulele had been broken even called out the staff members and demanded an alibi. ‘Maybe,’ he accused, ‘you just wanted to see what our reactions would be.’9
The mood within the research team soured as the week progressed. Their pricey experiment was on course to crash and burn. The boys weren’t fighting like Sherif’s ‘realistic conflict theory’ said they would, but instead remained the best of friends. Sherif blamed everyone but himself. He stayed up until two in the morning–pacing, as Perry could hear in the study’s audio recordings–and drinking.
It was on one of the last evenings that tensions boiled over. While the campers lay peacefully asleep, Sherif threatened to punch a research assistant for not doing his best to sow discord among the children. The assistant grabbed a block of wood in self-defence. ‘Dr. Sherif!’ his voice echoed through the night, ‘If you do it, I’m gonna hit you.’10
The children would eventually realise they were being manipulated, after one boy discovered a notebook containing detailed observations. After that, there was no choice but to call the experiment off. If anything had been proved, it was that once kids become friends it’s very hard to turn them against each other. ‘They misunderstood human nature,’ said one participant about the psychologists, years later. ‘They certainly misunderstood children.’11
4
If you think Dr Muzafer Sherif’s manipulations are outrageous, they pale in comparison to the scenario cooked up seventeen years later. On the face of it, the Stanford Prison Experiment and the Robbers Cave Experiment have a lot in common. Both had twenty-four white, male subjects, and both were designed to prove that nice people can spontaneously turn evil.12 But the Stanford Prison Experiment went a step further.
Philip Zimbardo’s study wasn’t just dubious. It was a hoax.
My own doubts surfaced on reading Zimbardo’s book The Lucifer Effect, published in 2007. I had always assumed his prison ‘guards’ turned sadistic of their own accord. Zimbardo himself had claimed exactly that hundreds of times, in countless interviews, and, in a hearing before the US Congress, had even testified that the guards ‘made up their own rules for maintaining law, order, and respect’.13
But then, partway through his book, Zimbardo suddenly mentions a meeting with the guards that took place on the Saturday preceding the experiment. That afternoon he briefed the guards on their role. There could be no mistaking his instructions:
We can create a sense of frustration. We can create fear in them […] We’re going to take away their individuality in various ways. They’re going to be wearing uniforms, and at no time will anybody call them by name; they will have numbers and be called only by their numbers. In general, what all this should create in them is a sense of powerlessness.14
When I came to this passage, I was stunned. Here was the supposedly independent scientist stating outright that he had drilled his guards. They hadn’t come up with the idea to address the prisoners by numbers, or to wear sunglasses, or play sadistic games. It’s what they were told to do.
Not only that, on the Saturday before the experiment started, Zimbardo was already talking about ‘we’ and ‘they’ as though he and the guards were on the same team. Which meant that the story he later told about losing himself in the role of prison superintendent as the experiment progressed couldn’t be true. Zimbardo had been calling the shots from day one.
To grasp how fatal this is for objective research, it’s important to know about what social scientists call demand characteristics. These are behaviours that subjects exhibit if they’re able to guess at the aim of a study, thus turning a scientific experiment into a staged production. And in the Stanford Prison Experiment, as one research psychologist put it, ‘the demands were everywhere’.15
What, then, did the guards themselves believe was expected of them? That they could sit around, maybe play some cards, and gossip about sports and girls? In a later interview, one student said he’d mapped out beforehand what he was going to do: ‘I set out with a definite plan in mind, to try to force the action, force something to happen, so that the researchers would have something to work with. After all, what could they possibly learn from guys sitting around like it was a country club?’16
That the Stanford Prison Experiment hasn’t been scrapped from the textbooks after confessions like this is bad enough. But it gets worse. In June 2013, French sociologist Thibault Le Texier stumbled across a TED Talk Zimbardo gave in 2009. A part-time filmmaker, Le Texier was immediately struck by the images Zimbardo showed on screen. The raw footage of screaming students looked, to his practised eye, like perfect material for a gripping documentary. So he decided to do some research.
Le Texier secured a grant from a French film fund and booked a flight to California. At Stanford he made two shocking discoveries. One was that he was the first to consult Zimbardo’s archives. The other was what those archives contained. Le Texier’s enthusiasm swiftly gave way to confusion and then to dismay: like Gina Perry, he found himself surrounded by piles of documents and recordings that presented what amounted to a whole different experiment.
‘It took quite a while before I accepted the idea that it could all be fake,’ Le Texier told me in the autumn of 2018, a year before his scathing analysis appeared in the world’s leading academic psychology journal, American Psychologist. ‘At first, I didn’t want to believe it. I thought: no, this is a reputable professor at Stanford University. I must be wrong.’