Black Box Thinking


by Matthew Syed


  To me, it didn’t make sense. Why jeopardize three years of work for the sake of a night on the town? What could they possibly hope to gain by arriving at the first exam, one of the most important days of their lives, with a hangover? The most surprising thing of all was that many were among the brightest students, who had worked diligently for the preceding three years.

  It was only years later, when reading about cognitive dissonance and the Fixed Mindset, that the pieces fell into place: they were so terrified of underperforming, so worried that the exam might reveal that they were not very clever, that they needed an alternative explanation for possible failure. They effectively sabotaged their own chances in order to gain one.

  Excuses in life are typically created retrospectively. We have all pointed to a bad night’s sleep, or a cold, or the dog being sick, to justify a poor performance. But these excuses are so obvious and self-serving that people see through them. We see through our own excuses too. They don’t reduce dissonance because they are too blatant.

  But self-handicapping is more sophisticated. This is where the excuse is not cobbled together after the event, but actively engineered beforehand. It is, in effect, a preemptive dissonance-reducing strategy. If these students flunked their crucial exam, they could say: “It wasn’t me who messed up, it was the booze!” It served another purpose, too: if they did pass the exam, they could still point to alcohol in mitigation for why they didn’t get an even higher grade.

  The phenomenon of self-handicapping seems, on the surface, perplexing: young athletes who stop training hard in the crucial few weeks before a big event; executives who breeze into a vital sales pitch without reading the relevant material; brilliant university students who suddenly decide to get drunk before a crucial exam.

  But viewed through the prism of the Fixed Mindset it makes perfect sense. It is precisely because the project really matters that failure is so threatening—and why they desperately need an alternative explanation for messing up. As one psychologist put it: “One can admit to a minor flaw [drinking] in order to avoid admitting to a much more threatening one [I am not as bright as I like to think].”13

  In a seminal 1978 study into self-handicapping by the psychologists Steven Berglas and Edward Jones, students were given an exam.14 Before taking the exam, students were asked whether they would like to take a drug that would inhibit their performance. On the face of it, this was hardly a choice at all: why would anyone wish to actively undermine their chances of success? But, as it turned out, a large proportion chose to take the drug.

  To some observers it seemed crazy, but to Dr. Berglas it made perfect sense. He had himself experimented with drugs for the first time just before he took the crucial SAT examinations in high school. He was expected to get a perfect score. His self-image was bound up in the performance. The drug-taking gave him the perfect cover story if things went wrong.15

  Some psychologists have argued that self-handicapping can have short-term benefits. If you can pin a particular failure on, say, drinking too much, it cushions your self-esteem in the event of a poor result. But this misses the real lesson in all of this. What is the point of preserving self-esteem that is so brittle that it can’t cope with failure?

  Think back to the surgeons earlier in the book. They had healthy egos. They had enjoyed expensive educations and owned impressive certificates. They were widely revered by colleagues and patients. But this is precisely why the culture was so dangerous. Surgeons are often so keen to protect their self-esteem that they can’t admit their fallibility.

  Self-esteem, in short, is a vastly overvalued psychological trait. It can cause us to jeopardize learning whenever we fear looking anything less than perfect. What we really need is resilience: the capacity to face up to failure, and to learn from it. Ultimately, that is what growth is all about.

  • • •

  On the evening of June 30, 1998, David Beckham’s life changed forever. He was twenty-three years old and playing for England in his first World Cup, in Saint-Étienne in central France. It was a crucial knockout match against Argentina for a place in the quarter-finals.

  The score was even at 2–2. More than 20 million of his countrymen were tuning in on television back home and tens of thousands more were watching in the stadium. For Beckham it was a dream to be out on the field of play representing his country.

  Two minutes into the second half, Beckham was in the middle of the pitch when he was hit hard from behind by Diego Simeone, an Argentinian player. He felt a knee go into his back and he was knocked flat. As Simeone got up, he tugged Beckham’s hair, and then patted him on the head.

  Beckham reacted immediately, flicking his leg toward his opponent. His foot traveled less than two feet, and made minimal contact with Simeone, but the Argentinian went down, clutching his thigh. Beckham instantly knew he had made a terrible mistake, and prepared for the worst. His stomach turned to ice as the referee raised a red card into the air.

  England would go on to lose the match on penalties. Beckham, who had been sent off and spent the rest of the game in the dressing room, knew that he would be in the line of fire from the British press. But nothing prepared him for the storm that was about to engulf him and his family.

  When the team arrived back at Heathrow Airport the next day, the twenty-three-year-old was pursued relentlessly by cameras and journalists. He received bullets in the mail, an effigy of him was hung from a lamppost, and one national newspaper turned his face into a dartboard.

  At the first match of the following season, he had to be escorted into the ground under police guard. Every time he touched the ball for Manchester United, opposing fans erupted in boos. He had made a small mistake in reacting to a poor challenge from an opponent at the World Cup, but he was treated almost like a criminal. Many commentators doubted he would last the season. As one journalist put it: “You have to fear for Beckham’s career. Nobody can expect him to come back from something like this.”

  As it turned out, Beckham had the finest season of his career. Manchester United won the Treble (the Premier League, the FA Cup and the Champions League), the first, and so far only, English club to achieve that feat. Beckham played in almost every game. At the end of the season he was voted second in the FIFA World Player of the Year awards behind Rivaldo of Brazil and Barcelona, and ahead of Batistuta, Zidane, Vieri, Figo, Shevchenko, and Raúl.

  His contributions were remarkable. He made sixteen assists in the league and seven in the Champions League. He scored vital goals, not least the opening strike in the historic FA Cup semifinal replay against Arsenal and an equalizer in the final game of the Premier League season against Spurs. He also took both corners when United scored twice in injury time to clinch the Champions League title from under the noses of Bayern Munich. It was a superb set of performances.

  But let us rewind to the very first game of that season, against Leicester. United were trailing 2–1 when they were awarded a free kick, just outside the area. It was a huge moment given what had happened just a few weeks earlier at Saint-Étienne. Beckham had been booed throughout the game by opposing fans. He would later say that his stomach tightened as he strode over to place the ball. But as he walked back to take the shot, he felt everything change. He said:

  It was only as I stepped up to take the free kick that I felt my willpower hardening. It would have been easy to be negative, to worry about the consequences, but I just felt that little bit of steel inside. Partly, it was the extraordinary support I had received [from United fans]. But it was also all the practice over the years: the thousands of free kicks I had taken in rain, sleet and snow. It gave me confidence.

  Adversity rarely comes in as public a form as that endured by Beckham in Saint-Étienne. But responding to adversity, coming back from failure, absolutely depends on how we regard the setback. Is it evidence that we lack what it takes? Does it mean we are not up to the job? This is the kind of response offered by those in a Fixed Mindset. They are sapped by impediments, and often lose willpower. They try to avoid feedback, even when they can learn from it.

  But when you regard failure as a learning opportunity, when you trust in the power of practice to help you grow through difficulties, your motivation and self-belief are not threatened in anything like the same way. Indeed, you embrace failure as an opportunity to learn, whether about improving a vacuum cleaner, creating a new scientific theory, or developing a promising soccer career.

  “It was tough to get sent off, but I learned a valuable lesson,” Beckham told me. “Isn’t that what life is about?”

  Coda

  The Big Picture

  I

  Almost every society studied by historians has had its own ideas about the way the world works, often in the form of myths, religions, and superstitions. Primitive societies usually viewed these ideas as sacrosanct, and often punished dissenters with death. Those in power didn’t want to be confronted with any evidence that they might be wrong.

  As the philosopher Bryan Magee put it: “The truth is to be kept inviolate and handed on unsullied from generation to generation. For this purpose, institutions develop—mysteries, priesthoods, and at an advanced stage, schools.”1 Schools of this kind never admitted new ideas, and expelled anyone who attempted to change the doctrine.2

  But at some point in human history this changed. Criticism was tolerated and even encouraged. According to the philosopher Karl Popper, this first occurred in the days of the ancient Greeks, but the precise historical claim is less important than what it meant in practice. The change ended the dogmatic tradition. It was, he says, the most important moment in intellectual progress since the discovery of language.

  And he is surely right. For centuries before the Greeks, the entire weight of intellectual history was about preserving and defending established ideas: religious, practical, and tribal. Why this defensive tendency should be so universal in human history has been a subject of speculation among anthropologists for many years.

  But the answer, surely, is that ancient tribes were trapped in a Fixed Mindset. They thought that the truth had been revealed by a god or god-like ancestor and did not feel any need to build new knowledge. New evidence was regarded not as an opportunity to learn fresh truths, but as a threat to the established worldview.

  Indeed, those who questioned traditional assumptions were often met with violence. History is full of episodes where ideas were tested not rationally but militarily. According to Encyclopedia of Wars by Charles Phillips and Alan Axelrod, 123 conflicts in human history can be traced directly to differences in opinion, whether religious, ideological, or doctrinal.3

  Think back to cognitive dissonance. This is where dissenting evidence is reframed or ignored. Wars of ideology can be seen as an extreme form of dissonance reduction: instead of shutting your ears to inconvenient evidence, you murder the dissenters. This is a sure-fire way to guarantee that religious and traditional assumptions are not challenged, but it also torpedoes any possibility of progress.

  But the Greek period challenged all this. As the philosopher Bryan Magee put it: “It spelled the end of the dogmatic tradition of passing on an unsullied truth, and the beginning of a new rational tradition of subjecting speculations to critical discussion. It was the inauguration of scientific method. Error was turned from disaster to advantage.”4

  It is difficult to exaggerate the significance of that last sentence. Error, under the Greeks, was no longer catastrophic, or threatening, or worth killing over. On the contrary, if someone had persuasive evidence revealing the flaws in your beliefs, it was an opportunity to learn, to revise your model of the world. Scientific knowledge was seen as dynamic rather than static; something that grows through critical investigation, rather than being handed down by authorities. As Xenophanes wrote:

  The gods did not reveal, from the beginning,
  All things to us, but in the course of time,
  Through seeking we may learn and know things better.

  This subtle shift had truly staggering effects. The Greek period inspired the greatest flowering of knowledge in human history, producing the forefathers of the entire Western intellectual tradition, including Socrates, Plato, Aristotle, Pythagoras, and Euclid. It changed the world in ways both subtle and profound. As Benjamin Farrington, former professor of classics at Swansea University, put it:

  With astonishment we find ourselves on the threshold of modern science. Nor should it be supposed that by some trick of translation the extracts [from ancient Greek manuscripts] have been given an air of modernity. Far from it. The vocabulary of these writings and their style are the source from which our own vocabulary and style have been derived.

  But this period was tragically not to last. Looking back from our vantage point, it is astonishing just how suddenly the advance in human knowledge ground to a halt. For much of the time between the Greeks and the seventeenth century, Western science remained in a cul-de-sac, a point that has been powerfully made by the philosopher, scientist, and politician Francis Bacon.

  As Bacon wrote in Novum Organum, his masterpiece, in 1620: “The sciences which we possess come for the most part from the Greeks. [But] from all these systems of the Greeks, and their ramifications through particular sciences, there can hardly after the lapse of so many years be adduced a single experiment which tends to relieve and benefit the condition of man.”5

  This was a truly devastating assessment. The key argument here is that science had come up with almost nothing to “benefit the condition of man.” To us, accustomed to the way science transforms human life, this seems remarkable. But in Bacon’s time, this was the way it had been for generations. Scientific progress just didn’t happen.

  Why this halt in progress? The answer is not difficult to identify: the world drifted back into the old mindset. The teachings of the early church were brought together with the philosophy of Aristotle (who had been elevated to a revered authority) to create a new, sacrosanct worldview. Anything that contradicted Christian teaching was considered blasphemous. Dissenters were punished. Error had, once again, become disastrous.

  Perhaps the most extraordinary example of how inconvenient evidence was ignored or reframed relates to the Judeo-Christian idea that women have one more rib than men, drawn from the passage in Genesis in which Eve is created from Adam’s rib. This could have been disproven at any time by doing something very simple: counting. Men and women, of course, have the same number of ribs.

  And yet this “truth” was generally accepted all the way until 1543, when it was contradicted by the Flemish anatomist Andreas Vesalius. This shows, once again, that when we are fearful of being wrong, when the desire to protect the status quo is particularly strong, mistakes can persist in plain sight almost indefinitely.

  Bacon’s towering achievement was to challenge the dogmatic conception of knowledge that had restrained mankind for centuries. Like the Greeks he argued that science was not about defending truths, but challenging them. It was about having the courage to experiment and learn. “The true and lawful goal of sciences is none other than this: that human life be endowed with new discoveries and powers,” he wrote.6

  He also warned against the dangers of confirmation bias:

  The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.7

  Bacon’s work, along with that of other great thinkers such as Galileo, set the stage for a second scientific revolution. Theories were subjected to experimental criticism. Creativity, as a direct consequence, flourished. Testing the ideas of authority figures thoroughly was not considered disrespectful, but obligatory. Error had once again been transformed from disaster to advantage.

  The point here is not that the ideas and theories of our forebears are not worth having; quite the reverse. Theories that have been through a process of selection, rigorously tested rules of thumb, practical knowledge honed through long trial and error and countless failures, are of priceless importance.

  We are the beneficiaries of a rich intellectual legacy and, if the slate were wiped clean, if all the cumulative knowledge gained by our ancestors were to somehow disappear, we would be lost. As Karl Popper put it: “If we started with Adam [i.e., with the relatively small amount of knowledge of early mankind], we wouldn’t get any further than Adam did.”8

  But theories that claim to furnish knowledge of the world, that claim to have never failed, held in place by authority alone, are a different matter. It is these ideas, and the underlying belief that they are sacrosanct, that are so destructive. The scientific method is about pushing out the frontiers of our knowledge through a willingness to embrace error.

  Think back to Galileo’s disproof of Aristotle’s theory that heavier objects fall faster than lighter ones (perhaps apocryphally, he did this by dropping balls from the Leaning Tower of Pisa). This was a crucial discovery, but it also symbolized the beautifully disruptive power of failure. A single controlled experiment had refuted the ideas of one of the most respected intellectual giants in history, setting the stage for new answers, new problems, and new discoveries.9

  But the battle between these two conceptions of the world—one revealed from above, the other discovered from below—continued to rage. When Galileo saw the phases of Venus and the mountains of the moon through his newly built telescope, he championed the Copernican view that the sun rather than the earth was at the center of the universe.

 
