
The Intelligence Trap


by David Robson


  Spicer told me that his interest stems from his PhD at the University of Melbourne, during which time he studied decision making at the Australian Broadcasting Corporation (ABC).5 ‘They would introduce these crazy change management programmes, which would often result in nothing changing except creating a huge amount of uncertainty.’

  Many employees acknowledged the flaws in the corporation’s decision making. ‘You found a lot of very smart people thrown together in an organisation and many of them would spend a lot of time complaining how stupid the organisation was,’ Spicer told me. What really surprised him, however, was the number of people who failed to acknowledge the futility of what they were doing. ‘These extremely high-skilled and knowledgeable professionals were getting sucked into these crazy things, saying “this is intelligent, this is rational”, then wasting an incredible amount of time.’*

  * The same culture can also be seen in the BBC’s offices – a fact the broadcaster itself lampoons in its mockumentary TV series, W1A. Having worked at the BBC while researching this book, it occurs to me that deciding to create a three-series sitcom about your own organisational failings – rather than fixing them – is perhaps the definition of functional stupidity.

  Years later he would discuss such organisational failings with Alvesson at a formal academic dinner. In their resulting studies, the pair examined dozens of other examples of organisational stupidity, from the armed forces to IT analysts, newspaper publishers and their own respective universities, to determine whether many institutions really do make the most of their staff’s brains.

  Their conclusions were deeply depressing. As Alvesson and Spicer wrote in their book, The Stupidity Paradox: ‘Our governments spend billions on trying to create knowledge economies, our firms brag about their superior intelligence, and individuals spend decades of their lives building up fine CVs. Yet all this collective intellect does not seem to be reflected in the many organisations we studied . . . Far from being “knowledge-intensive”, many of our most well-known chief organisations have become engines of stupidity.’6

  In parallel with the kinds of biases and errors behind the intelligence trap, Spicer and Alvesson define ‘stupidity’ as a form of narrow thinking lacking three important qualities: reflection about basic underlying assumptions, curiosity about the purpose of your actions, and a consideration of the wider, long-term consequences of your behaviours.7 For many varied reasons, employees simply aren’t being encouraged to think.

  This stupidity is often functional, they say, because it can come with some benefits. Individuals may prefer to go with the flow in the workplace to save effort and anxiety, particularly if they know there will be incentives or even a promotion in it for them later. Such ‘strategic ignorance’ is now well studied in psychological experiments where participants must compete for money: often participants choose not to know how their decisions affect the other players.8 By remaining in the dark, the player gains some ‘moral wiggle room’ (the scientific term) that allows them to act in a more selfish way.

  We might also be persuaded by social pressure: after all, no one likes a trouble-maker who delays meetings with endless questions. Unless we are actively encouraged to share our views, staying quiet and nodding along with the people around us can improve our individual prospects – even if that means temporarily turning off our critical capacities.

  Besides helping the individual, this kind of narrow-minded, unquestioning approach can also bring some immediate benefits for the organisation, increasing productivity and efficiency in the short term without the employees wasting time questioning the wisdom of their behaviours. The result is that some companies may – either accidentally or deliberately – actually encourage functional stupidity within their offices.

  Spicer and Alvesson argue that many work practices and structures contribute to an organisation’s functional stupidity, including excessive specialisation and division of responsibilities. A human resources manager may now have the very particular, single task of organising personality tests, for instance. As the psychological research shows us, our decision making and creativity benefits from hearing outside perspectives and drawing parallels between different areas of interest; if we mine the same vein day after day, we may begin to pay less attention to the nuances and details. The German language, incidentally, has a word for this: the Fachidiot, a one-track specialist who takes a single-minded, inflexible approach to a multifaceted problem.

  But perhaps the most pervasive – and potent – source of functional stupidity is the demand for complete corporate loyalty and an excessive focus on positivity, where the very idea of criticism may be seen as a betrayal, and admitting disappointment or anxiety is considered a weakness. This is a particular bugbear for Spicer, who told me that relentless optimism is now deeply embedded in many business cultures, stretching from start-ups to huge multinationals.

  He described research on entrepreneurs, for instance, who often cling to the motto that they will ‘fail forward’ or ‘fail early, fail often’. Although these mottos sound like an example of the ‘growth mindset’ – which should improve your chances of success in the future – Spicer says that entrepreneurs often look to explain their failings with external factors (‘my idea was before its time’) rather than considering the errors in their own performance and how they might adapt in the future. They aren’t really considering their own personal growth.

  The numbers are huge: between 75 and 90 per cent of entrepreneurs lose their first businesses – but by striving to remain relentlessly upbeat and positive, they remain oblivious to their mistakes.9 ‘Instead of getting better – which this “fail forward” idea would suggest – they actually get worse over time,’ Spicer said. ‘Because of these self-serving biases, they just go and start a new venture and make exactly the same mistakes over and over again . . . and they actually see this as a virtue.’

  The same attitude is prevalent among much larger and more established corporations, where bosses tell their employees to ‘only bring me the good news’. Or you may attend a brainstorming session, where you are told that ‘no idea is a bad idea’. Spicer argues that this is counter-productive; we are actually more creative when we take on board a criticism at an early stage of a discussion. ‘You’ve tested the assumptions and then you are able to enact upon them, instead of trying to push together ideas to cover up any differences.’

  I hope you will now understand the intelligence trap well enough to see immediately some of the dangers of this myopic approach.

  The lack of curiosity and insight is particularly damaging during times of uncertainty. Based on his observations in editorial meetings, for instance, Alvesson has argued that overly rigid and unquestioning thinking of this kind prevented newspapers from exploring how factors like the economic climate and rising taxes were influencing their sales; editors were so fixated on examining specific headlines on their front pages that they forgot even to consider the need to explore broader new strategies or outlets for their stories.

  But Nokia’s implosion in the early 2010s offers the most vivid illustration of the ways that functional stupidity can drive an outwardly successful organisation to failure.

  If you owned a cellphone in the early 2000s, chances are that it was made by the Finnish company. In 2007, they held around half the global market share. Six years later, however, most of their customers had turned away from the clunky Nokia interface to more sophisticated smartphones, notably Apple’s iPhone.

  Commentators at the time suggested that Nokia was simply an inferior company with less talent and innovation than Apple, that the corporation had been unable to see the iPhone coming, or that they had been complacent, assuming that their own products would trump any others.

  But as they investigated the company’s demise, the Finnish and Singaporean researchers Timo Vuori and Quy Huy found that none of this was true.10 Nokia’s engineers were among the best in the world, and they were fully aware of the risks ahead. Even the CEO himself had admitted, during an interview, that he was ‘paranoid about all the competition’. Yet they nevertheless failed to rise to the occasion.

  One of the biggest challenges was Nokia’s operating system, Symbian, which was inferior to Apple’s iOS and unsuited to sophisticated touchscreen apps. Overhauling the existing software would take years of development, but the management wanted to present their new products quickly, leading them to rush through projects that needed greater forward planning.

  Unfortunately, employees were not allowed to express any doubts about the way the company was proceeding. Senior managers would regularly shout ‘at the tops of their lungs’ if you told them something they did not want to hear. Raise a doubt, and you risked losing your job. ‘If you were too negative, it would be your head on the block,’ one middle manager told the researchers. ‘The mindset was that if you criticise what’s being done, then you’re not genuinely committed to it,’ said another.

  As a consequence, employees began to feign expertise rather than admitting their ignorance about the problems they were facing, and accepted deadlines that they knew would be impossible to maintain. They would even massage the data showing their results so as to give a better impression. And when the company lost employees, it deliberately hired replacements with a ‘can do’ attitude – people who would nod along with new demands rather than disagreeing with the status quo. The company even ignored advice from external consultants, one of whom claimed that ‘Nokia has always been the most arrogant company ever towards my colleagues.’ They lost any chance of an outside perspective.

  The very measures that were designed to focus employees’ attention and encourage a more creative outlook were making it harder and harder for Nokia to step up to the competition.

  As a result, the company consistently failed to upgrade its operating system to a suitable standard – and the quality of Nokia’s products slowly deteriorated. By the time the company launched the N8 – their final attempt at an ‘iPhone Killer’ – in 2010, most employees had secretly lost faith. It flopped, and after further losses Nokia’s mobile phone business was acquired by Microsoft in 2013.

  The concept of functional stupidity is inspired by extensive observational studies, including an analysis of Nokia’s downfall, rather than psychological experiments, but this kind of corporate behaviour shows clear parallels with psychologists’ work on dysrationalia, wise reasoning and critical thinking.

  You might remember, for instance, that feelings of threat trigger the so-called ‘hot’, self-serving cognition that leads us to justify our own positions rather than seeking evidence that challenges our point of view – and this reduces scores of wise reasoning. (It is the reason we are wiser when advising a friend about a relationship problem, even if we struggle to see the solution to our own troubles.)

  Led by its unyielding top management, Nokia as an organisation was therefore beginning to act like an individual, faced with uncertain circumstances, whose ego has been threatened. Nokia’s previous successes, meanwhile, may have given it a sense of ‘earned dogmatism’, meaning that managers were less open to suggestions from experts outside the company.

  Various experiments from social psychology suggest that this is a common pattern: groups under threat tend to become more conformist, single-minded and inward looking. More and more members begin to adopt the same views, and they start to favour simple messages over complex, nuanced ideas. This is even evident at the level of entire nations: newspaper editorials within a country tend to become more simplified and repetitive when it faces international conflict, for instance.11

  No organisation can control its external environment: some threats will be inevitable. But organisations can alter the way they translate those perceived dangers to employees, by encouraging alternative points of view and actively seeking disconfirming information. It’s not enough to assume that employing the smartest people possible will automatically translate to better performance; you need to create the environment that allows them to use their skills.

  Even the companies that appear to buck these trends may still incorporate some elements of evidence-based wisdom – although it may not be immediately obvious from their external reputation. The media company Netflix, for instance, famously has the motto that ‘adequate performance earns a generous severance’ – a seemingly cut-throat attitude that might promote myopia and short-term gains over long-term resilience.

  Yet they seem to balance this with other measures that are in line with the broader psychological research. A widely circulated presentation outlining Netflix’s corporate vision, for example, emphasises many of the elements of good reasoning that we have discussed so far, including the need to recognise ambiguity and uncertainty and to challenge prevailing opinions – exactly the kind of culture that should encourage wise decision making.12

  We can’t, of course, know how Netflix will fare in the future. But its success to date would suggest that you can avoid functional stupidity while also running an efficient – some would say ruthless – operation.

  The dangers of functional stupidity do not end with these instances of corporate failure. Besides impairing creativity and problem solving, a failure to encourage reflection and internal feedback can also lead to human tragedy, as NASA’s disasters show.

  ‘Often it leads to a number of small mistakes being made, or the [company] focuses on the wrong problems and overlooks a problem where there should have been some sort of post mortem,’ notes Spicer. As a consequence, an organisation may appear outwardly successful while slowly sliding towards disaster.

  Consider the Space Shuttle Columbia disaster in 2003, when foam insulation broke off an external tank during launch and struck the left wing of the orbiter. The resulting hole caused the shuttle to disintegrate upon re-entry into the Earth’s atmosphere, leading to the death of all seven crew members.

  The disaster would have been tragic enough had it been a fluke, one-off occurrence without any potential warning signs. But NASA engineers had long known the insulation could break away like this; it had happened in every previous launch. For various reasons, however, the damage had never occurred in the right place to cause a crash, meaning that the NASA staff began to ignore the danger it posed.

  ‘It went from being a troublesome event for engineers and managers to being classified as a housekeeping matter,’ Catherine Tinsley, a professor of management at Georgetown University in Washington DC who has specialised in studying corporate catastrophes, told me.

  Amazingly, similar processes were also behind the Challenger disaster in 1986, when the shuttle exploded due to a faulty seal that had deteriorated in the cold Florida winter. Subsequent reports showed that the seals had cracked on many previous missions, but rather than see this as a warning, the staff had come to assume that it would always be safe. As Richard Feynman – a member of the Presidential Commission investigating the disaster – noted, ‘when playing Russian roulette, the fact that the first shot got off safely is little comfort for the next’.13 Yet NASA did not seem to have learnt from those lessons.

  Tinsley emphasises that this isn’t a criticism of those particular engineers and managers. ‘These are really smart people, working with data, and trying really hard to do a good job.’ But NASA’s errors demonstrate just how easily your perception of risk can shift radically without you even recognising that a change has occurred. The organisation was blind to the possibility of disaster.

  The reason appears to be a form of cognitive miserliness known as the outcome bias, which leads us to focus on the actual consequences of a decision without even considering the alternative possible results. Like many of the other cognitive flaws that afflict otherwise intelligent people, it’s really a lack of imagination: we passively accept the most salient detail from an event (what actually happened) and don’t stop to think about what might have been, had the initial circumstances been slightly different.

  Tinsley has now performed many experiments confirming that the outcome bias is a very common tendency among many different professionals. One study asked business students, NASA employees and space-industry contractors to evaluate the mission controller ‘Chris’, who took charge of an unmanned spacecraft under three different scenarios. In the first, the spacecraft launches perfectly, just as planned. In the second, it has a serious design flaw, but thanks to a turn of luck (its alignment to the sun) it can make its readings effectively. And in the third, there is no such stroke of fortune, and it completely fails.

  Unsurprisingly, the complete failure is judged most harshly, but most of the participants were happy to ignore the design flaw in the ‘near-miss’ scenario, and instead praised Chris’s leadership skills. Importantly – and in line with Tinsley’s theory that the outcome bias can explain disasters like the Columbia catastrophe – the perception of future dangers also diminished after the participants had read about the near miss, explaining how some organisations may slowly become immune to failure.14

  Tinsley has now found that this tendency to overlook errors was the common factor in dozens of other catastrophes. ‘Multiple near-misses preceded and foreshadowed every disaster and business crisis we studied,’ Tinsley’s team concluded in an article for the Harvard Business Review in 2011.15

  Take one of the car manufacturer Toyota’s biggest disasters. In August 2009, a Californian family of four died when the accelerator pedal of their Lexus jammed, leading the driver to lose control on the motorway and plough into an embankment at 120 miles per hour, where the car burst into flames. Toyota had to recall more than six million cars – a disaster that could have been avoided if the company had paid serious attention to more than two thousand reports of accelerator malfunction over the previous decades, which is around five times the number of complaints that a car manufacturer might normally expect to receive for this issue.16

 
