by Matthew Syed
As late as 2016, security experts were making the same point. In a column for the National Review, Fred Fleitz, a former CIA analyst who became chief of staff for the National Security Council under President Trump, criticised an initiative to increase diversity at the CIA. ‘Protecting our nation from such threats requires extremely competent and capable individuals to conduct intelligence operations and write analysis in challenging security and legal environments . . . The CIA’s mission is too serious to be distracted by social-engineering efforts.’
Part of the reluctance to recruit ethnic minorities was fear of counter-espionage, but it went far deeper. Those who called for a broader intake were shouted down for undermining excellence. The CIA should be about the brightest and the best! Defence is too important to allow diversity to trump ability! As one observer put it: ‘Political correctness should never be elevated above national security.’
What they didn’t realise was that this was a false, and perilous, dichotomy.
III
This is a book about diversity, about the power of bringing people together who think differently from one another. At one level, this might seem like a curious objective. Surely, we should aim to think correctly or accurately, not differently. One should only wish to think differently from other people when they are in the wrong. When other people are right, thinking differently will only lead you into error. This seems like common sense.
Another seemingly commonsensical statement was that made by Justice Scalia. He argued that recruiting people because they are different, in one way or another, is to jeopardise performance. You should hire people because they are smart, or knowledgeable or fast. Why would you hire people who are less knowledgeable, less fast or less talented, just because they are different?
In the coming pages, we will show that both these intuitions are false, at least when it comes to the challenging problems we care most about. If we are intent upon answering our most serious questions, from climate change to poverty, and curing diseases to designing new products, we need to work with people who think differently, not just accurately. And this requires us to take a step back and view performance from a fundamentally different vantage point.
Consider an irony in the way we traditionally think about success. If you look at science or, indeed, popular literature, the focus is on individuals. How can we improve the knowledge or perceptiveness of ourselves or our colleagues? Fine books such as Peak by Anders Ericsson and Robert Pool, Sources of Power by Gary Klein and Mindset by Carol Dweck have become bestsellers. All examine, in their different ways, how we can improve individual abilities over time.
A host of other excellent books follow this approach but in a slightly different way. Even when we have developed expertise, we can still be vulnerable to biases and quirks that undermine our capacity to make wise judgements. Thinking, Fast and Slow by Daniel Kahneman, Predictably Irrational by Dan Ariely and Misbehaving by Richard Thaler all seek to improve performance by understanding these biases, and how to guard against them.
But by focusing on individuals, there has been a tendency to overlook what we might call the ‘holistic perspective’. A good way to understand the difference is to consider a colony of ants. A naive entomologist might seek to understand the colony by examining the ants within the colony. Individual ants, after all, deploy a vast range of behaviours, such as collecting leaves, marching, etc. They are busy and fascinating creatures. And yet you could spend a year, indeed a lifetime, examining individuals and learn virtually nothing of the colony. Why? Because the interesting thing about ants is not the parts but the whole. Instead of zooming in on individual ants, the only way to understand the colony is to zoom out. One step removed, you can comprehend the colony as a coherent organism, capable of solving complex problems such as building sophisticated homes and finding sources of food. An ant colony is an emergent system. The whole is greater than the sum of its parts.
This book will argue that a similar irony applies to human groups. Pretty much all the most challenging work today is undertaken in groups for a simple reason: problems are too complex for any one person to tackle alone. The number of papers written by individual authors has declined year by year in almost all areas of academia. In science and engineering, 90 per cent of papers are written by teams. In medical research, collaborations outnumber individual papers by three to one.
In business, we see the same trend. A team led by Brian Uzzi, a psychologist at the Kellogg School of Management, examined more than two million patents issued in the United States since 1975 and found that teams are dominant in every single one of the thirty-six categories. The same trend is seen in the marketplace. Twenty-five years ago, most equity funds were managed by individuals. Now, the vast majority are run by teams. ‘The most significant trend in human creativity is the shift from individuals to teams, and the gap between teams and individuals is increasing with time’, Uzzi writes.
And this is why the holistic perspective is so imperative. We need to think of human performance not from the standpoint of the individual but from the standpoint of the group. From this more rounded perspective, we’ll see that diversity is the critical ingredient driving what we might term collective intelligence.
There are, of course, many types of diversity. Differences in gender, race, age and religion are sometimes classified under the heading ‘demographic diversity’ (or ‘identity diversity’). We will be focusing not upon demographic diversity, but cognitive diversity. That’s to say, differences in perspective, insights, experiences and thinking styles. There is often (but not always) an overlap between these two concepts. People from different backgrounds, with different experiences, often think about problems in different ways. We will analyse the precise relationship later in the book.
Cognitive diversity was not so important a few hundred years ago, because the problems we faced tended to be linear, or simple, or separable, or all three. A physicist who can accurately predict the position of the moon doesn’t need a different opinion to help her do her job. She is already bang on the money. Any other opinion is false. This goes back to our common-sense intuition. Thinking differently is a distraction. With complex problems, however, this logic flips. Groups that contain diverse views have a huge, often decisive, advantage.
Another point worth noting is that these are not speculative claims; rather, they emerge from rigorous, if initially puzzling, axioms. Indeed, as Scott Page, an expert in complexity science at the University of Michigan, Ann Arbor, has pointed out, these axioms apply as much to computers as to humans. As we shall see, artificial intelligence today is no longer about single algorithms, however sophisticated. Rather, it is about ensembles of algorithms that ‘think’ differently, search differently and encode problems in diverse ways.
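The book names no particular system or library, but a minimal sketch may make the idea concrete. Assuming Python with scikit-learn (an illustrative choice, not one drawn from the source), the snippet below combines three models that ‘think’ differently – a linear model, a decision tree and a nearest-neighbour model – by majority vote, in the spirit of the diverse ensembles described above:

```python
# A minimal sketch of ensemble diversity (assumption: scikit-learn as the
# illustrative library; the source names no specific system).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Three models with different 'frames of reference' on the same data.
models = [
    ("linear", LogisticRegression(max_iter=1000)),     # global, linear frame
    ("tree", DecisionTreeClassifier(random_state=0)),  # axis-aligned splits
    ("knn", KNeighborsClassifier()),                   # local, distance-based
]

for name, model in models:
    print(name, cross_val_score(model, X, y, cv=5).mean())

# Majority vote across the three diverse models.
ensemble = VotingClassifier(estimators=models, voting="hard")
print("ensemble", cross_val_score(ensemble, X, y, cv=5).mean())
```

The point of the sketch is not the particular algorithms but the design choice: each model errs in a different direction, so the vote tends to cancel out their individual blind spots.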
Over the coming pages, the contours of a new science will emerge. Our journey will take us to some unusual destinations: the death zone at the summit of Mount Everest, the American neo-Nazi movement after the 2008 Presidential Election and sub-Saharan Africa at the dawn of our species. We will see why the US Air Force endured so many crashes in the early 1950s, how the Dutch reinvented football and why most diets suit almost nobody. We will look at success stories, peeling back the layers of how they happened and examining their hidden logic. We will look at seminal failures, too. Often, it is looking at what went wrong that can provide the most vivid pointers about how to get things right.
By the end of the book, we will be equipped with a fresh perspective on how success happens, one with implications not just for governments and business, but for all of us. Harnessing the power of cognitive diversity is set to become a key source of competitive advantage, and the surest route to reinvention and growth. You might even say that we are entering the age of diversity.
But let’s start out by looking at a selection of puzzles and thought experiments. These will help to shed light on what cognitive differences mean, and why they matter. We will then return to the build-up to 9/11 and one of the defining intelligence failures of modern times. Often, it is real-world examples that shine the greatest light of all.
IV
In 2001, Richard E. Nisbett and Takahiko Masuda, two social psychologists from the University of Michigan at Ann Arbor, took two groups – one from Japan and the other from the United States – and showed them video clips from underwater scenes. When asked to describe what they had seen, the Americans talked about the fish. They seemed able to recall high levels of detail about the objects. They said things like: ‘Well, I saw three big fish swimming off to the left, they had white bellies, and pink dots.’ The Japanese, on the other hand, overwhelmingly talked about the context rather than the objects: ‘I saw what looked like a stream, the water was green, there were rocks and shells and plants on the bottom . . . Oh, and there were three fish swimming off to the left.’16
To the experimenters, it was as if the two groups were seeing different scenes, shaped by differences in culture. America is a more individualistic society; Japanese culture is more interdependent. Americans tend to focus on objects; the Japanese on context.
In the next stage of the experiment, the subjects were shown new underwater scenes, with some objects they had seen before and some they had not. When the initial objects were placed in a different context, this threw the Japanese. They struggled to recognise the objects. It was as if the new context diverted their attention. The Americans, on the other hand, had the opposite problem. They were blind to changes in the context.
To the researchers, this was a profoundly surprising result. For decades, a central tenet of psychology was that humans apprehend the world in fundamentally similar ways. This is called ‘universalism’. As Nisbett put it: ‘I had been a lifelong universalist concerning the nature of human thought . . . Everyone has the same basic cognitive processes. Maori herders, !Kung hunter-gatherers, and dotcom entrepreneurs all rely on the same tools for perception, memory, causal analysis . . . etc.’
But the underwater experiment showed that even in our most direct interaction with the world – the act of looking at it – there are systematic differences shaped by culture. Nisbett’s paper has now been cited more than a thousand times and inspired a thriving research programme. We might say, taking a step back, that Americans and Japanese operate with a different ‘frame of reference’. The Americans – on average and acknowledging differences within the group – have a more individualistic frame. The Japanese, on the other hand, have a more contextual frame. Each frame attends to useful information. Each frame picks out important features of the underwater scene. Each frame also contains blind spots. The pictures are incomplete.
But now suppose you were to combine a Japanese observer and an American in a ‘team’. Alone, they might perceive only a partial picture. Alone, they each miss aspects of the scene. Together, however, they are able to recount both objects and context. By combining two partial frames of reference, the overall picture snaps into focus. They now have a more comprehensive grasp of reality.
This experiment is a first, tentative attempt at gently pushing back on one of the intuitions mentioned earlier. You’ll remember that Justice Scalia argued that organisations could choose diversity or they could ‘choose to be super-duper’. This implied a trade-off between diversity and excellence. And this is certainly true in a linear task like running (or predicting the orbit of the moon).
And yet the underwater scene experiment hints that, in different contexts, this logic begins to fray. If two people have perspectives that are incomplete, joining them together can yield more insight, not less. They are both wrong, so to speak. They both miss something. But they are wrong in different directions. This means that their shared picture is richer and more accurate. You can glimpse this in a slightly different way by examining a fresh problem, this time something called an ‘insight puzzle’. Consider the following teaser:
Suppose you are a doctor faced with a patient who has a malignant tumour in his stomach. It is impossible to operate on the patient, but unless the tumour is destroyed the patient will die. There is a kind of ray that can be used to destroy the tumour. If the rays reach the tumour all at once at a sufficiently high intensity, the tumour will be destroyed. Unfortunately, at this intensity the healthy tissue that the rays pass through on the way to the tumour will also be destroyed. At lower intensities the rays are harmless to healthy tissue, but they will not affect the tumour, either. What type of procedure might be used to destroy the tumour with the rays and at the same time avoid destroying the healthy tissue?17
If you can’t solve this puzzle, you are not alone. More than 75 per cent of people say that there is no solution, and that the patient will die. But now read the following, seemingly unrelated, story:
A fortress was situated in the middle of the country, surrounded by farms and villages. Many roads led to the fortress through the countryside. A rebel general vowed to capture the fortress but learned that mines had been planted on each of the roads. The mines were set so that small bodies of men could pass over them safely, but any large force would detonate them. The general divided his army into small groups and dispatched each group to the head of a different road. When all was ready, he gave the signal and each group marched down a different road. Each group continued down its road so that the entire army arrived at the fortress at the same time. In this way, the general captured the fortress.18
Now, think back to the medical problem. Can you see the solution now? When tested after reading the story about the fortress, more than 70 per cent of people found a way to save the patient – treble the initial number. Somehow, the analogy of the fortress allowed them to glimpse a solution that had previously eluded them. (The solution is to position multiple ray guns around the patient, each delivering 10 per cent of the required intensity. The rays converge to destroy the tumour, but no single beam is strong enough to harm the healthy tissue it passes through.)
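For readers who want to see the arithmetic behind that solution, here is a toy calculation (the thresholds and the number of beams are illustrative assumptions, not figures from the puzzle):

```python
# Toy arithmetic for the converging-rays solution (illustrative numbers only).
LETHAL_TO_TUMOUR = 100   # intensity needed where the beams converge
SAFE_FOR_TISSUE = 10     # maximum intensity healthy tissue can tolerate

n_beams = 10
beam_intensity = LETHAL_TO_TUMOUR / n_beams   # 10 units per beam

# Healthy tissue lies on at most one beam's path, so it receives a single
# beam's dose; the tumour sits where all the beams converge.
tissue_dose = beam_intensity
tumour_dose = n_beams * beam_intensity

assert tissue_dose <= SAFE_FOR_TISSUE    # each individual path stays harmless
assert tumour_dose >= LETHAL_TO_TUMOUR   # the converged dose destroys the tumour
```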
This is, of course, an artificial example. But it nevertheless offers a sense of how different perspectives can contribute to solving a challenging problem – in this case, someone with a military background might be of assistance to an oncologist. In such examples, it is not so much a case of one person being right and another wrong. Rather, it is a case of how looking at a problem through different lenses can jog new insights, new metaphors – and new solutions.
This example challenges intuition in another way, too. When faced with a difficult medical problem, the temptation is to recruit more and more doctors. After all, doctors have the most medical knowledge. But if these experts bring similar backgrounds and training (and, by implication, similar frames of reference), they are likely to share the same blind spots. Sometimes you need to look at a problem in a new way, perhaps with the eyes of an outsider.
The critical point is that solutions to complex problems typically rely on multiple layers of insight and therefore require multiple points of view. The great American academic Philip Tetlock puts it this way: ‘The more diverse the perspectives, the wider the range of potentially viable solutions a collection of problem solvers can find.’ The trick is to find people with different perspectives that usefully impinge on the problem at hand.
V
Before resuming our analysis of 9/11, let us briefly examine another area of research that will prove central to this book: ‘perspective blindness’. This refers to the fact that we are oblivious to our own blind spots. We perceive and interpret the world through frames of reference but we do not see the frames of reference themselves. This, in turn, means that we tend to underestimate the extent to which we can learn from people with different points of view.
Perspective blindness was the subject of David Foster Wallace’s address to Kenyon College in 2005, rated by Time magazine as one of the greatest commencement speeches ever recorded. The speech starts in a fish tank. ‘There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says, “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, “What the hell is water?” ’
Wallace’s point is that our modes of thought are so habitual that we scarcely notice how they filter our perception of reality. The danger arises when we overlook the fact that in most areas of life there are other people, with different ways of looking at things, who might deepen our own understanding, just as we might deepen theirs. John Cleese, the British comedian, put it this way: ‘Everybody has theories. The dangerous people are those who are not aware of their own theories. That is, the theories on which they operate are largely unconscious.’
The journalist Reni Eddo-Lodge has offered many examples of perspective blindness. In one, she describes a period when she couldn’t afford to take the train all the way to work, so had to cycle part of the way instead. The experience opened a new window on the world:
An uncomfortable truth dawned upon me as I lugged my bike up and down flights of stairs in commuter-town train stations: the majority of public transport I’d been travelling on was not easily accessible. No ramps. No lifts. Nigh-on impossible to access for parents with buggies, or people using wheelchairs, or people with mobility issues, like a frame or a cane. Before I’d had my own wheels to carry, I’d never noticed this problem. I’d been oblivious to the fact that this lack of accessibility was affecting hundreds of people.19
This experience provided her with a perspective that she had not merely lacked previously, but didn’t know that she lacked. It opened her eyes to a blind spot about her blind spots. This example doesn’t imply, of course, that all commuter stations should necessarily be equipped with ramps or lifts. But it does show that we can only perform a meaningful cost-benefit analysis if the costs and benefits are perceived. We have to see things before we can make sense of them. This, in turn, hinges on differences in perspective: people who can help us to see our own blind spots, and whom we can help to see theirs.