
The Intelligence Trap

by David Robson


  Could a simple change of thinking style help save some of those lives? To find out, I met Silvia Mamede in the hubbub of Rotterdam’s Erasmus Medical Centre. Mamede moved to the Netherlands from Ceará, Brazil, more than a decade ago, and she immediately offers me a strong cup of coffee – ‘not the watery stuff you normally get here’ – before sitting opposite me with a notebook in hand. ‘You organise your ideas better if you have a pencil and paper,’ she explains. (Psychological research does indeed suggest that your memory often functions better if you are allowed to doodle as you talk.45)

  Her aim is to teach doctors to be similarly reflective about the way they make their decisions. Like the medical checklist, which the doctor and writer Atul Gawande has shown to be so powerful for preventing memory failures during surgery, the concept is superficially simple: to pause, think and question your assumptions. Early attempts to engage ‘system 2’ thinking had been disappointing, however; doctors told to use pure analysis, in place of intuition – by immediately listing all the alternative hypotheses, for instance – often performed worse than those who had taken a less deliberative, more intuitive approach.46

  In light of the somatic marker hypothesis, this makes sense. If you ask someone to reflect too early, they fail to draw on their experience, and may become overly focused on inconsequential information. You are blocking them from using their emotional compass, and so they become a little like Damasio’s brain injury patients, stuck in their ‘analysis paralysis’. You can’t just use system 1 or system 2 – you need to use both.

  For this reason, Mamede suggests that doctors note down their gut reaction as quickly as possible, and only then analyse the evidence for that initial hunch and compare it with alternative hypotheses. Sure enough, she has since found that doctors can improve their diagnostic accuracy by up to 40 per cent by taking this simple approach – a huge achievement for such a small measure. Simply telling doctors to revisit their initial hypothesis – without any detailed instructions on re-examining the data or generating new ideas – managed to boost accuracy by 10 per cent, which again is a significant improvement for little extra effort.

  Importantly, and in line with the broader research on emotion, this reflective reasoning also reduces the ‘affective biases’ that can sway a doctor’s intuition. ‘There are all these factors that could disturb “System 1” – the patient’s appearance, whether they are rich or poor, the time pressure, whether they interrupt you,’ she said. ‘But the hope is that reflective reasoning can make the physician take a step back.’

  To explore one such factor, Mamede recently tested how doctors respond to ‘difficult’ patients, such as those who rudely question the professionals’ decisions. Rather than observing real encounters, which would be difficult to measure objectively, Mamede offered fictional vignettes to a group of general practitioners (family doctors). The text mostly outlined the patients’ symptoms and test results, but it also included a couple of sentences detailing their behaviour.

  Many of the doctors did not even report noticing the contextual information, while others were perplexed as to why they had been given these extra details. ‘They said, “But this doesn’t matter! We are trained to look past that, to not look at the behaviour. This should make no difference,” ’ Mamede told me. In fact, as the research on emotion would suggest, it had a huge impact. For more complex cases, the general practitioners were 42 per cent more likely to make a diagnostic error for the difficult patients.47

  If the doctors were told to engage in the more reflective procedure, however, they were more likely to look past their frustration and give a correct diagnosis. It seems that the pause in their thinking allowed them to gauge their own emotions and correct for their frustration, just as the theories of emotion differentiation and regulation would predict.

  Mamede has also examined the availability bias, which causes doctors to over-diagnose an illness if it has recently appeared in the media and is already on their mind. Again, she has shown that the more reflective procedure eliminates the error – even though she offered no specific instructions or explanations warning them of that particular bias.48 ‘It’s amazing, when you see the graphs of these studies. The doctors who weren’t exposed to the reports of disease had an accuracy of 71 per cent, and the biased ones only had an accuracy of 50 per cent. And then, when they reflected, they went back to the 70 per cent,’ she told me. ‘So it completely corrected for the bias.’

  These are astonishing results for such small interventions, and they all show us the power of greater self-awareness when we allow ourselves to think more reflectively about our intuitions.

  Some doctors may resist Mamede’s suggestions; the very idea that, after all their training, something so simple could correct their mistakes is bruising to the ego, particularly when many take enormous pride in the power of their rapid intuition. At conferences, for instance, she will present a case on the projector and wait for the doctors to give a diagnosis. ‘It’s sometimes twenty seconds – they just read four or five lines and they say “appendicitis”,’ she told me. ‘There is even this joke saying that if the doctor needs to think, leave the room.’

  But there is now a growing momentum throughout medicine to incorporate the latest psychological findings into the physician’s daily practice. Pat Croskerry at Dalhousie University in Canada is leading a critical thinking programme for doctors, and much of his advice echoes the research we have explored in this chapter – including, for instance, the use of mindfulness to identify the emotional sources of our decisions and, when errors have occurred, the employment of a ‘cognitive and affective autopsy’ to identify the reasons that an intuition backfired. He also advocates ‘cognitive inoculation’ – using case-studies to identify the potential sources of bias, which should mean that doctors are more mindful of the factors influencing their thinking.

  Croskerry is still collecting the data from his courses to see the long-term effects on diagnostic accuracy. But if these methods can prevent just a small portion of those 40,000–80,000 deaths per year, they will have contributed more than a major new drug.49

  Although medicine is leading the way, a few other professions are also coming around to this way of thinking. The legal system, for instance, is notoriously plagued by bias – and in response to this research, the American Judges Association has now issued a white paper that advocates mindfulness as one of the key strategies to improve judicial decision making, while also advising each judge to take a moment to ‘read the dials’ and interrogate their feelings in detail, just as neuroscientists and psychologists such as Feldman Barrett are suggesting.50

  Ultimately, these findings could change our understanding of what it means to be an expert.

  In the past, psychologists had described four distinct stages in the learning curve. The complete beginner is unconsciously incompetent – she does not even know what she doesn’t know (potentially leading to the over-confidence of the Dunning–Kruger effect we saw in Chapter 3). After a short while, however, she will understand the skills she lacks, and what she must do to learn them; she is consciously incompetent. With effort, she can eventually become consciously competent – she can solve most problems, but she has to think a lot about the decisions she is making. Finally, after years of training and on-the-job experience, those decisions become second nature – she has reached unconscious competence. This was traditionally the pinnacle of expertise, but as we have seen, she may then hit a kind of ‘ceiling’ where her accuracy plateaus as a result of the expert biases we explored in Chapter 3.51 To break through that ceiling, we may need one final stage – ‘reflective competence’ – which describes the capacity to explore our feelings and intuitions, and to identify biases before they cause harm.52

  As Ray Kroc had found in that Californian diner, intuition can be a powerful thing – but only once we know how to read those funny-bone feelings.

  6

  A bullshit detection kit: How to recognise lies and misinformation

  If you were online at the turn of the new millennium, you may remember reading about the myth of the ‘flesh-eating bananas’.

  In late 1999, a chain email began to spread across the internet, reporting that fruit imported from Central America could infect people with ‘necrotising fasciitis’ – a rare disease in which the skin erupts into livid purple boils before disintegrating and peeling away from muscle and bone. The email stated:

  Recently this disease has decimated the monkey population in Costa Rica . . . It is advised not to purchase bananas for the next three weeks as this is the period of time for which bananas that have been shipped to the US with the possibility of carrying this disease. If you have eaten a banana in the last 2–3 days and come down with a fever followed by a skin infection seek MEDICAL ATTENTION!!!

  The skin infection from necrotizing fasciitis is very painful and eats two to three centimeters of flesh per hour. Amputation is likely, death is possible. If you are more than an hour from a medical center burning the flesh ahead of the infected area is advised to help slow the spread of the infection. The FDA has been reluctant to issue a country wide warning because of fear of a nationwide panic. They have secretly admitted that they feel upwards of 15,000 Americans will be affected by this but that these are ‘acceptable numbers’. Please forward this to as many of the people you care about as possible as we do not feel 15,000 people is an acceptable number.

  By 28 January 2000, public concern was great enough for the US Centers for Disease Control and Prevention to issue a statement denying the risks. But their response only poured fuel on the flames, as people forgot the correction but remembered the scary, vivid idea of the flesh-eating bananas. Some of the chain emails even started citing the CDC as the source of the rumours, giving them greater credibility.

  Within weeks, the CDC was hearing from so many distressed callers that it was forced to set up a banana hotline, and it was only by the end of the year that the panic burned itself out as the feared epidemic failed to materialise.1

  The necrotising-fasciitis emails may have been one of the first internet memes – but misinformation is not a new phenomenon. As the eighteenth-century writer Jonathan Swift wrote in an essay on the rapid spread of political lies: ‘Falsehood flies and the truth comes limping after it’.

  Today, so-called ‘fake news’ is more prevalent than ever. One survey in 2016 found that more than 50 per cent of the most shared medical stories on Facebook had been debunked by doctors, including the claim that ‘dandelion weed can boost your immune system and cure cancer’ and reports that the HPV vaccine increased your risk of developing cancer.2

  The phenomenon is by no means restricted to the West – though the particular medium may depend on the country. In India, for instance, false rumours spread like wildfire through WhatsApp across its 300 million smartphones – covering everything from local salt shortages to political propaganda and wrongful allegations of mass kidnappings. In 2018, these rumours even triggered a spate of lynchings.3

  You would hope that traditional education could protect us from these lies. As the great American philosopher John Dewey wrote in the early twentieth century: ‘If our schools turn out their pupils in that attitude of mind which is conducive to good judgment in any department of affairs in which the pupils are placed, they have done more than if they sent out their pupils merely possessed of vast stores of information, or high degrees of skill in specialized branches.’4

  Unfortunately, the work on dysrationalia shows us this is far from being the case. While university graduates are less likely than average to believe in political conspiracy theories, they are slightly more susceptible to misinformation about medicine – believing, for instance, that pharmaceutical companies are withholding cancer drugs for profit, or that doctors are hiding evidence that vaccines cause illnesses.5 They are also more likely to use unproven, complementary medicines.6

  It is telling that one of the first people to introduce the flesh-eating banana scare to Canada was Arlette Mendicino, who worked at the University of Ottawa’s medical faculty – someone who should have been more sceptical.7 ‘I thought about my family, I thought about my friends. I had good intentions,’ she told CBC News after she found out she’d been fooled. Within a few days, the message had spread across the country.

  In our initial discussion of the intelligence trap, we explored the reasons why having a higher IQ might make you ignore contradictory information, so that you become even more tenacious in your existing beliefs – but that doesn’t really explain why someone like Mendicino could be so gullible in the first place. Clearly this involves yet more reasoning skills that aren’t included in the traditional definitions of general intelligence, but that are essential if we want to become immune to these kinds of lies and rumours.

  The good news is that certain critical thinking techniques can protect us from being duped, but to learn how to apply them, we first need to understand how certain forms of misinformation are deliberately designed to escape deliberation and why the traditional attempts to correct them often backfire so spectacularly. This new understanding not only teaches us how to avoid being duped ourselves; it is also changing the way that many global organisations respond to unfounded rumours.

  Before we continue, first consider the following statements and say which is true and which is false in each pairing:

  Bees can be trained to distinguish Impressionist from Cubist paintings

  Bees cannot remember left from right

  And

  Coffee can reduce your risk of diabetes

  Cracking your knuckles can cause arthritis

  And now consider the following opinions, and say which rings true for you:

  Woes unite foes

  Strife bonds enemies

  And consider which of these online sellers you would shop with:

  rifo073 Average user rating: 3.2

  edlokaq8 Average user rating: 3.6

  We’ll explore your responses in a few pages, but as you read each pair of statements you may have had a hunch that one was more truthful or trustworthy than the other. The reasons why are helping scientists to understand the concept of ‘truthiness’.

  The term was first popularised by the American comedian Stephen Colbert in 2005 to describe the ‘truth that comes from the gut, not from the book’, as a reaction to George W. Bush’s decision making and the public perception of his thinking. But it soon became clear that the concept could be applied to many situations,8 and it has now sparked serious scientific research.

  Norbert Schwarz and Eryn Newman have led much of this work, and to find out more, I visited them in their lab at the University of Southern California in Los Angeles. Schwarz happens to have been one of the leaders in the new science of emotional decision making that we touched on in the last chapter, showing, for instance, the way the weather sways our judgement of apparently objective choices. The work on truthiness extends this idea to examine how we intuitively judge the merits of new information.

  According to Schwarz and Newman, truthiness comes from two particular feelings: familiarity (whether we feel that we have heard something like it before) and fluency (how easy a statement is to process). Importantly, most people are not even aware that these two subtle feelings are influencing their judgement – yet they can nevertheless move us to believe a statement without questioning its underlying premises or noting its logical inconsistencies.

  As a simple example, consider the following question from some of Schwarz’s earlier studies of the subject:

  How many animals of each kind did Moses take on the Ark?

  The correct answer is, of course, zero. Moses didn’t have an ark – it was Noah who weathered the flood. Yet even when assessing highly intelligent students at a top university, Schwarz has found that just 12 per cent of people register that fact.9

  The problem is that the question’s phrasing fits into our basic conceptual understanding of the Bible, meaning we are distracted by the red herring – the quantity of animals – rather than focusing on the name of the person involved. ‘It’s some old guy who had something to do with the Bible, so the whole gist is OK,’ Schwarz told me. The question turns us into a cognitive miser, in other words – and even the smart university students in Schwarz’s study didn’t notice the fallacy.

  Like many of the feelings fuelling our intuitions, fluency and familiarity can be accurate signals. It would be too exhausting to examine everything in extreme detail, particularly if it’s old news; and if we’ve heard something a few times, that would suggest that it’s a consensus opinion, which may be more likely to be true. Furthermore, things that seem superficially straightforward often are exactly that; there’s no hidden motive. So it makes sense to trust things that feel fluent.

  What’s shocking is just how easy it is to manipulate these two cues with simple changes to presentation so that we miss crucial details.

  In one iconic experiment, Schwarz found that people are more likely to fall for the Moses illusion if the statement is written in a pleasant, easy-to-read font – making the reading more fluent – compared to an uglier, italic script that is harder to process. For similar reasons, we are also more likely to believe people talking in a recognisable accent, compared to someone whose speech is harder to understand, and we place our trust in online vendors with easier-to-pronounce names, irrespective of their individual ratings and reviews by other members. Even a simple rhyme can boost the ‘truthiness’ of a statement, since the resonating sounds of the words make it easier for the brain to process.10

  Were you influenced by any of these factors in those questions at the start of this chapter? For the record, bees really can be trained to distinguish Impressionist from Cubist paintings (and they do also seem to distinguish left from right); coffee can reduce your risk of diabetes, while cracking your knuckles does not appear to cause arthritis.11 But if you are like most people, you may have been swayed by the subtle differences in the way the statements were presented – with the fainter, grey ink and ugly fonts making the true statements harder to read, and less ‘truthy’ as a result. And although they mean exactly the same thing, you are more likely to endorse ‘woes unite foes’ than ‘strife bonds enemies’ – simply because it rhymes.

 
