The Intelligence Trap

by David Robson


  Each generator is visited by a team of inspectors every two years, each visit lasting five to six weeks. Although one-third of INPO’s inspectors are permanent staff, the majority are seconded from other power plants, leading to a greater sharing of knowledge between organisations, and the regular input of an outside perspective in each company. INPO also actively facilitates discussions between lower-level employees and senior management with regular review groups. This ensures that the fine details and challenges of day-to-day operations are acknowledged and understood at every level of the hierarchy.

  To increase accountability, the results of the inspections are announced at an annual dinner – meaning that ‘You get the whole top level of the utility industry focused on the poor performer’, according to one CEO quoted in the Presidential Commission’s report. Often, CEOs in the room will offer to loan their expertise to bring other generators up to scratch. The result is that the companies are constantly learning from each other’s mistakes. Since INPO began operating, US generators have seen a tenfold reduction in the number of worker accidents.35

  You need not be a fan of nuclear power to see how these structures maximise the collective intelligence of employees across the industry and greatly increase each individual’s awareness of potential risks, while reducing the build-up of those small, unacknowledged errors that can lead to catastrophe. INPO shows the way that regulatory bodies can help mindful cultures to spread across organisations, uniting thousands of employees in their reflection and critical thinking.

  The oil industry has not (yet) implemented a comparably intricate system, but energy companies have banded together to revise industry standards, improve worker training and education, and upgrade their technology to better contain a spill, should it occur. BP has also funded a huge research programme to deal with the environmental devastation in the Gulf of Mexico. Some lessons have been learnt – but at what cost?36

  The intelligence trap often emerges from an inability to think beyond our expectations – to imagine an alternative vision of the world, where our decision is wrong rather than right. This must have been the case on 20 April 2010; no one could possibly have imagined the true scale of the catastrophe they were letting loose.

  Over the subsequent months, the oil slick would cover more than 112,000 km² of the ocean’s surface – an area that is roughly 85 per cent the size of England.37 According to the Center for Biological Diversity, the disaster killed at least 80,000 birds, 6,000 sea turtles and 26,000 marine mammals – an ecosystem destroyed by preventable errors. Five years later, baby dolphins were still being born with under-developed lungs, due to the toxic effects of the oil leaked into the water and the poor health of their parents. Only 20 per cent of dolphin pregnancies resulted in a live birth.38

  That’s not to mention the enormous human cost. Besides the eleven lives lost on the rig itself and the unimaginable trauma inflicted on those who escaped, the spill devastated the livelihoods of fishing communities in the Gulf. Two years after the spill, Darla Rooks, a lifelong fisherperson from Port Sulfur, Louisiana, described finding crabs ‘with holes in their shells, shells with all the points burned off so all the spikes on their shells and claws are gone, misshapen shells, and crabs that are dying from within . . . they are still alive, but you open them up and they smell like they’ve been dead for a week’.

  The level of depression in the area rose by 25 per cent over the following months, and many communities struggled to recover from their losses. ‘Think about losing everything that makes you happy, because that is exactly what happens when someone spills oil and sprays dispersants on it,’ Rooks told Al Jazeera in 2012.39 ‘People who live here know better than to swim in or eat what comes out of our waters.’

  This disaster was entirely preventable – if only BP and its partners had recognised the fallibility of the human brain and its capacity for error. No one is immune, and the dark stain in the Gulf of Mexico should be a constant reminder of the truly catastrophic potential of the intelligence trap.

  Epilogue

  We began this journey with the story of Kary Mullis – the brilliant chemist who has dabbled in astrology and astral projection, and even defended AIDS denialism. It should now be clear how factors such as motivated reasoning could have led him to ignore every warning sign.

  But I hope it has become clear that The Intelligence Trap is so much more than the story of any individual’s mistakes. The trap is a phenomenon that concerns us all, given the kinds of thinking that we, as a society, have come to appreciate, and the ones we have neglected.

  Interviewing so many brilliant scientists for this book, I came to notice that each expert seemed, in some way, to embody the kind of intelligence or thinking that they had been studying. David Perkins was unusually thoughtful, frequently pausing our conversation to reflect before we continued; Robert Sternberg, meanwhile, was tremendously pragmatic in conveying his message; Igor Grossmann was extremely humble and took extra care to emphasise the limits of his knowledge; and Susan Engel was animated with endless curiosity.

  Perhaps they were attracted to their field because they wanted to understand their own thinking better; or perhaps their own thinking came to resemble the subject of their study. Either way, to me it was one more illustration of the enormous range of potential thinking styles available to us, and the benefits they bring.

  James Flynn describes the rise in IQ over the twentieth century as our ‘cognitive history’; it shows the ways our minds have been moulded by the society around us. But it strikes me that if each of these scientists had been able to present and promote their work in the early nineteenth century, before the concept of general intelligence came to determine the kind of thinking that was considered ‘smart’, our cognitive history might have been very different. As it is, the abstract reasoning measured by IQ tests, SATs and GREs still dominates our understanding of what constitutes intelligence.

  We don’t need to deny the value of those skills, or abandon the learning of factual knowledge and expertise, to accept that other ways of reasoning and learning are equally deserving of our attention. Indeed, if I have learnt anything from this research, it is that cultivating these other traits often enhances the skills measured by standard tests of cognitive ability, as well as making us more rounded and wiser thinkers.

  Study after study has shown that encouraging people to define their own problems, explore different perspectives, imagine alternative outcomes to events, and identify erroneous arguments can boost their overall capacity to learn new material while also encouraging a wiser way of reasoning.1

  I found it particularly encouraging that learning with these methods often benefits people across the intelligence spectrum. They can reduce motivated reasoning among the highly intelligent, for instance, but they can also improve the general learning of people with lower intelligence. One study by Bradley Owens at the State University of New York at Buffalo found that intellectual humility predicted academic achievement better than an IQ test. Students with higher intellectual humility performed better, but – crucially – it was of most benefit to those with lower intelligence, completely compensating for their lower ‘natural’ ability.2 The principles of evidence-based wisdom can help anyone to maximise their potential.

  This new understanding of human thinking and reasoning could not have come at a more important time.

  As Robert Sternberg wrote in 2018: ‘The steep rise in IQ has bought us, as a society, much less than anyone had any right to hope for. People are probably better at figuring out complex cell phones and other technological innovations than they would have been at the turn of the twentieth century. But in terms of our behaviour as a society, are you impressed with what 30 points has brought us?’3

  Although we have made some strides in areas such as technology and healthcare, we are no closer to solving pressing issues such as climate change or social inequality – and the increasingly dogmatic views that often come with the intelligence trap only stand in the way of the negotiations between people with different positions that might lead to a solution. The World Economic Forum has listed increasing political polarisation and the spread of misinformation in ‘digital wildfires’4 as two of the greatest threats facing us today – comparable to terrorism and cyber warfare.

  The twenty-first century presents complex problems that require a wiser way of reasoning, one that recognises our current limitations, tolerates ambiguity and uncertainty, balances multiple perspectives, and bridges diverse areas of expertise. And it is becoming increasingly clear that we need more people who embody those qualities.

  This may sound like wishful thinking, but remember that American presidents who scored higher on scales of open-mindedness and perspective taking were far more likely to find peaceful solutions to conflict. It’s not unreasonable to ask whether, given this research, we should be actively demanding those qualities in our leaders, in addition to more obvious measures of academic achievement and professional success.

  If you want to apply this research yourself, the first step is to acknowledge the problem. We have now seen how intellectual humility can help us see through our bias blind spot, form more rational opinions, avoid misinformation, learn more effectively, and work more productively with the people around us. As the philosopher Valerie Tiberius, who is now working with psychologists at the Chicago Center for Practical Wisdom, points out, we often spend huge amounts of time trying to boost our self-esteem and confidence. ‘But I think that if more people had some humility about what they know and don’t know, that would go a tremendous distance to improving life for everyone.’

  To this end, I have included a short ‘taxonomy’ of definitions in the appendix, outlining the most common errors at the heart of the intelligence trap and some of the best ways to deal with them. Sometimes, just being able to put a label on your thinking opens the door to a more insightful frame of mind. I have found that it can be an exhilarating experience to question your own intelligence in these ways, as you reject many of the assumptions you have always taken for granted. It allows you to revive the childlike joy of discovery that drove everyone from Benjamin Franklin to Richard Feynman.

  It is easy, as adults, to assume that we have reached our intellectual peak by the time we finish our education; indeed, we are often told to expect a mental decline soon after. But the work on evidence-based wisdom shows that we can all learn new ways of thinking. Whatever our age and expertise, whether a NASA scientist or a school student, we can all benefit from wielding our minds with insight, precision and humility.5

  Appendix: Taxonomies of Stupidity and Wisdom

  A Taxonomy of Stupidity

  Bias blind spot: Our tendency to see others’ flaws, while being oblivious to the prejudices and errors in our own reasoning.

  Cognitive miserliness: A tendency to base our decision making on intuition rather than analysis.

  Contaminated mindware: Erroneous baseline knowledge that may then lead to further irrational behaviour. Someone who has been brought up to distrust scientific evidence may be more susceptible to quack medicines and beliefs in the paranormal, for instance.

  Dysrationalia: The mismatch between intelligence and rationality, as seen in the life story of Arthur Conan Doyle. This may be caused by cognitive miserliness or contaminated mindware.

  Earned dogmatism: The belief that our self-perceived expertise has earned us the right to be closed-minded and to ignore other points of view.

  Entrenchment: The process by which an expert’s ideas become rigid and fixed.

  Fachidiot: Professional idiot. A German term to describe a one-track specialist who is an expert in their field but takes a blinkered approach to a multifaceted problem.

  Fixed mindset: The belief that intelligence and talent are innate, and exerting effort is a sign of weakness. Besides limiting our ability to learn, this attitude also seems to make us generally more closed-minded and intellectually arrogant.

  Functional stupidity: A general reluctance to self-reflect, question our assumptions, and reason about the consequences of our actions. Although this may increase productivity in the short term (making it ‘functional’), it reduces creativity and critical thinking in the long term.

  ‘Hot’ cognition: Reactive, emotionally charged thinking that may give full rein to our biases. Potentially one source of Solomon’s paradox (see below).

  Meta-forgetfulness: A form of intellectual arrogance. We fail to keep track of how much we know and how much we have forgotten; we assume that our current knowledge is the same as our peak knowledge. This is common among university graduates; years down the line, they believe that they understand the issues as well as they did when they took their final exams.

  Mindlessness: A lack of attention and insight into our actions and the world around us. It is a particular issue in the way children are educated.

  Moses illusion: A failure to notice errors in a statement, due to its fluency and familiarity. For example, when answering the question, ‘How many animals of each kind did Moses take on the Ark?’, most people answer two – failing to notice that it was Noah, not Moses, who built the Ark. This kind of distraction is a common tactic for purveyors of misinformation and fake news.

  Motivated reasoning: The unconscious tendency to apply our brainpower only when the conclusions will suit our predetermined goal. It may include the confirmation or myside bias (preferentially seeking and remembering information that suits our goal) and disconfirmation bias (the tendency to be especially sceptical about evidence that does not fit our goal). In politics, for instance, we are far more likely to critique evidence concerning an issue such as climate change if it does not fit with our existing worldview.

  Peter principle: We are promoted based on our aptitude at our current job – not on our potential to fill the next role. This means that managers inevitably ‘rise to their level of incompetence’. Lacking the practical intelligence necessary to manage teams, they subsequently underperform. (Named after management theorist Laurence Peter.)

  Pseudo-profound bullshit: Seemingly impressive assertions that are presented as true and meaningful but are actually vacuous under further consideration. Like the Moses illusion, we may accept their message due to a general lack of reflection.

  Solomon’s paradox: Named after the ancient Israelite king, Solomon’s paradox describes our inability to reason wisely about our own lives, even if we demonstrate good judgement when faced with other people’s problems.

  Strategic ignorance: Deliberately avoiding the chance to learn new information to avoid discomfort and to increase our productivity. At work, for instance, it can be beneficial not to question the long-term consequences of your actions, if that knowledge will interfere with the chances of promotion. These choices may be unconscious.

  The too-much-talent effect: The unexpected failure of teams once their proportion of ‘star’ players reaches a certain threshold. See, for instance, the England football team in the Euro 2016 tournament.

  A Taxonomy of Wisdom

  Actively open-minded thinking: The deliberate pursuit of alternative viewpoints and evidence that may question our opinions.

  Cognitive inoculation: A strategy to reduce biased reasoning by deliberately exposing ourselves to examples of flawed arguments.

  Collective intelligence: A team’s ability to reason as one unit. Although it is very loosely connected to IQ, factors such as the social sensitivity of the team’s members seem to be far more important.

  Desirable difficulties: A powerful concept in education – we actually learn better if our initial understanding is made harder, not easier. See also Growth mindset.

  Emotional compass: A combination of interoception (sensitivity to bodily signals), emotion differentiation (the capacity to label your feelings in precise detail) and emotion regulation that together help us to avoid cognitive and affective biases.

  Epistemic accuracy: Someone is epistemically accurate if their beliefs are supported by reason and factual evidence.

  Epistemic curiosity: An inquisitive, interested, questioning attitude; a hunger for information. Not only does curiosity improve learning; the latest research shows that it also protects us from motivated reasoning and bias.

  Foreign language effect: The surprising tendency to become more rational when speaking a second language.

  Growth mindset: The belief that talents can be developed and trained. Although the early scientific research on mindset focused on its role in academic achievement, it is becoming increasingly clear that it may drive wiser decision making, by contributing to traits such as intellectual humility.

  Intellectual humility: The capacity to accept the limits of our judgement and to try to compensate for our fallibility. Scientific research has revealed that this is a critical, but neglected, characteristic that determines much of our decision making and learning, and which may be particularly crucial for team leaders.

  Mindfulness: The opposite of mindlessness. Although this can include meditative practice, it refers to a generally reflective and engaged state that avoids reactive, overly emotional responses to events and allows us to note and consider our intuitions more objectively. The term may also refer to an organisation’s risk management strategy (see Chapter 10).

 
