Black Box Thinking
But it should be clear that this is not just about psychotherapy, it is about intuitive expertise and decision-making in all its manifestations. If we are operating in an environment without meaningful feedback, we can’t improve. We must institutionalize access to the “error signal.”
This is also true of developing expertise in sports. In sports, feedback is almost always instant and obvious. We know when we have hit a ball out of bounds in golf or mistimed a forehand in tennis. But enlightened training environments maximize the quantity and quality of feedback, thus increasing the speed of adaptation.
Take soccer. Every time a player fails to control an incoming pass, he has learned something. Over time the central nervous system adapts, building more finesse and touch. But if a young player practices on a full-size pitch, touching the ball infrequently, he will not improve very fast. On the other hand, if he practices on a smaller pitch, touching the ball frequently, his skill will improve more quickly.
Feedback is relevant to all the skills in soccer, including perceptual awareness, dribbling, and passing, as well as the integration of all of these abilities in a real-match context. Great coaches are not interested merely in creating an environment where adaptation can take place; they are focused on the “meta” question of which training system is the most effective. They don’t just want players to improve, but to do so as fast and as profoundly as possible.
In a similar way, in health care, there are debates about whether the Virginia Mason System creates the most effective method of reducing medical errors, just as there are discussions about whether the Toyota Production System is the best way of improving efficiency on a production line. But both models will eventually be superseded. We will learn to create more effective evolutionary systems, not just in health care and manufacturing, but in aviation, too.*
How, then, to select between competing evolutionary systems? A good way is to run a trial. In the case of soccer, for example, you could randomly divide a squad of youngsters with similar ability into two groups, then train them for a few weeks using different drills, then bring them back together and measure who has improved faster. A controlled trial of this kind, provided there is objective measurement, would establish the relative effectiveness of the drills, without the comparison being obscured by all other influences. In other words, the process of selecting between evolutionary systems is itself evolutionary in nature.
Another practical challenge in harnessing the power of failure is to do so while minimizing the costs. One way for corporations and governments to achieve this is with pilot schemes. These provide an opportunity to learn on a small scale. But it is vital that pilots are designed to test assumptions rather than confirm them. If you populate a pilot with your best staff in a prized location, you will learn virtually nothing about the challenges that are likely to occur.
As Amy Edmondson of Harvard Business School puts it:
Managers in charge of piloting a new product or service . . . typically do whatever they can to make sure that the pilot is perfect right out of the starting gate. Ironically, this hunger to succeed can later inhibit the success of the official launch. Too often, managers in charge of pilots design optimal conditions rather than representative ones. Thus the pilot doesn’t produce knowledge about what won’t work.
Another powerful method we have looked at is randomized controlled trials. These are growing in the corporate world, but remain unexploited in many areas such as politics. The Behavioural Insights Team (BIT), a small organization that started life inside Number 10 Downing Street and is now a social purpose company, was set up in 2010 to address this problem. It has already conducted more RCTs than the rest of the UK government combined in its entire history (sadly, this isn’t saying much).
At a couple of meetings at their offices in central London, the team talked through some of these trials, not just in the UK but beyond. In one they tested different styles of letter (different wording, and so on) sent to Guatemalan taxpayers who had failed to declare their income tax on time. The most effective design increased payment by an astonishing 43 percent. This is the power of testing to see what works and what doesn’t. “There is still a great deal of political resistance to running trials, in the UK and beyond,” David Halpern, the chief executive of BIT, said, “but we are slowly making progress.”
Another “failure based” technique, which has come into vogue in recent years, is the so-called pre-mortem. With this method a team is invited to consider why a plan has gone wrong before it has even been put into action. It is the ultimate “fail fast” technique. The idea is to encourage people to be open about their concerns, rather than hiding them out of fear of sounding negative.
The pre-mortem is crucially different from considering what might go wrong. With a pre-mortem, the team is told, in effect, that “the patient is dead”: the project has failed; the objectives have not been met; the plans have bombed. Team members are then asked to generate plausible reasons why. By making the failure concrete rather than abstract, it alters the way the mind thinks about the problem.
According to the celebrated psychologist Gary Klein, “prospective hindsight,” as it is called, increases the ability of people to correctly identify reasons for future outcomes by 30 percent. It has also been backed by a host of leading thinkers, including Daniel Kahneman. “The pre-mortem is a great idea,” he said. “I mentioned it at Davos . . . and the chairman of a large corporation said it was worth coming to Davos for.”14
A pre-mortem typically starts with the leader asking everyone in the team to imagine that the project has gone horribly wrong and to write down the reasons why on a piece of paper. He or she then asks everyone to read a single reason from the list, starting with the project manager, before going around the table again.
Klein cites examples where issues have surfaced that would otherwise have remained buried. “In a session held at one Fortune 50–size company, an executive suggested that a billion-dollar environmental sustainability project had ‘failed’ because interest waned when the CEO retired,” he writes. “Another pinned the failure on a dilution of the business case after a government agency revised its policies.”15
The purpose of the pre-mortem is not to kill off plans, but to strengthen them. It is also very easy to conduct. “My guess is that, in general, doing a pre-mortem on a plan that is about to be adopted won’t cause it to be abandoned,” Kahneman has said. “But it will probably be tweaked in ways that everybody will recognize as beneficial. So the pre-mortem is a low-cost, high-pay-off kind of thing.”
Throughout the book we have looked at other techniques such as marginal gains and the lean start-up. But the point about all these methods is that they harness the incalculable potency of the evolutionary mechanism. Providing they are used with an eye to context, and are fused with a growth-oriented mindset, they set the stage for an endlessly powerful process: cumulative adaptation.
IV
On a clear afternoon in early spring, I visited Martin Bromiley, the pilot whose story opened this book. He lost his wife, Elaine, during a routine operation in 2005. His two children, Adam and Victoria, were four and five at the time; at the time of this writing, they are fourteen and fifteen.
North Marston is a classically beautiful English village. In the center is a small pub called the Pilgrim. Rolling hills and green meadows surround a small, tight-knit community with a population of around eight hundred people. The sun was shining as I drove through the quiet lanes to the Bromiley family home.
As we sat in his living room, Martin talked about his ongoing campaign to champion patient safety. Slight, quietly spoken, but determined, he continues to lead the Clinical Human Factors Group as an unpaid volunteer, and spends much of his free time encouraging the adoption of a mindset that regards adverse events not as threats but as learning opportunities.
A couple of weeks before our meeting, Martin had sent out a tweet to gauge what the campaign had achieved. His question was characteristically simple and to the point. “Question—can you give me some specific examples of the impact of learning from my late wife’s death? How has it changed things?” he wrote.
Within minutes, responses started flowing in, not just from the UK but around the world. Mark, a consultant in respiratory and intensive care medicine in Swindon, wrote: “It has been one of the drivers for increasing simulation training. This is having a big impact on improving quality of care.”
Nick, who works in medical safety, wrote: “We use your story at both undergraduate and postgraduate to discuss situational awareness and hierarchy/raising concerns.” Jo Thomas, a nurse and senior lecturer in paramedic science, wrote: “Your strength is reaching clinicians far beyond the operating and anesthetic/recovery rooms. [It has] challenged assumptions.”
Geoff Healy, an anesthetist from Sydney, Australia, wrote: “Your strength and courage has educated at least two if not three or more generations of anesthetists. The lives saved or altered because of your work are incalculable. We refer to this event every day.”
These answers articulate the truth that hopefully underpins this book. Learning from failure may have the sound of a management cliché. It may be trotted out as a truism or a mantra lacking traction. But the quiet work of Martin Bromiley should help us to glimpse a wider vista. Learning from failure expresses a profound moral purpose. It is about saving, sustaining and enhancing human life. Martin said:
There has undoubtedly been progress in many areas of health care. Ten years ago, hospital-acquired infections like MRSA were dismissed as “one of those things.” They were considered an inevitable problem that we couldn’t do much about. Today, there is a real desire to confront these types of problems and figure out how to prevent harm in the future.
But that mindset is by no means universal. You only have to look at the sheer scale of preventable deaths, both in the UK and around the world, to see that there is still a profound tendency to cover up mistakes, and a fear about what independent investigations might uncover. We need to flip this attitude 180 degrees. It is the single most important issue in health care.
As the sun began to set over the horizon, the front door swung open: Adam and Victoria had returned from school. It happened to be Adam’s fourteenth birthday, and he spoke with excitement about going out for pizza that evening. I asked them what they were hoping to do with their lives. Victoria answered instantly and emphatically: “I want to be a pilot,” she said. Adam expressed an interest in aviation, too, but leans toward meteorology.
We started to talk about the work that their father is doing to change attitudes in health care. “I am really proud of Dad,” Adam said. “He puts so much time into the group, even though he has a full-time job. If you had told him ten years ago that he would make such a big difference, he wouldn’t have believed it. He gets letters and messages almost every week.”
Victoria, sitting alongside him, nodded. “Our mother’s death was very hard for all of us and we know that nothing can bring her back,” she said, her face etched with emotion. “But I hope Dad continues with his work, and helps to spare other families from what we have had to go through.”
Victoria paused for a moment, and then her face brightened. “I think Mum would have liked that,” she said.
Acknowledgments
I have failed quite a lot in life, particularly in my old sport of table tennis, so the subject matter of this book is close to my heart. The idea for it was triggered by a growing realization that the common theme linking successful people, organizations, and systems is a healthy and empowering attitude to failure. This is as true of David Beckham and James Dyson as it is of the aviation industry and Google.
The book has gone through a number of iterations, hopefully finding marginal gains with each change, which mirrors the argument within the pages about how improvement happens. Most of these iterations were inspired by the suggestions of friends and colleagues who read early drafts. I am hugely grateful to Danny Finkelstein, David Papineau, Chris Dillow, Max Reuter, Ben Preston, Andy Kidd, Kathy Weeks, Carl Macrae, Mark Thomas, Dilys Syed, David Honigman, and James Naylor. Any defects that remain are mine and mine alone, although I hope I can learn from them, too.
I would also like to thank the brilliant Nick Davies, who edited the book in the UK, Emily Angell, who edited the U.S. edition, and Jonny Geller, my agent, who always fizzes with ideas and enthusiasm. I have also had terrific support from colleagues at The Times, including Tim Hallissey, Nicola Jeal and John Witherow. The Times is a wonderful publication to work for.
One of the most enjoyable things about writing a book of this kind is coming into contact with eye-opening books, papers and journal articles. I have tried to reference all of these in the endnotes, which provide further reading for those who wish to delve a little deeper, but I would like to acknowledge, here, some of the books that influenced me the most. These include a number of works by Karl Popper: The Logic of Scientific Discovery; Conjectures and Refutations; The Open Society and Its Enemies; The Poverty of Historicism; and Unended Quest. I have also much enjoyed, and learned from, The Structure of Scientific Revolutions by Thomas Kuhn and Against Method by Paul Feyerabend.
There are some marvelous popular books that have influenced the argument, too. These include Just Culture by Sidney Dekker, Safe Patients, Smart Hospitals by Peter Pronovost, Human Error by James Reason, Being Wrong by Kathryn Schulz, Adapt by Tim Harford, Antifragile by Nassim Nicholas Taleb, Complications by Atul Gawande, Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson, Uncontrolled by Jim Manzi, Teaming by Amy Edmondson, Where Good Ideas Come From by Steven Johnson, Creativity, Inc. by Ed Catmull, Self-Theories by Carol Dweck, The Decisive Moment by Jonah Lehrer, and Philosophy and the Real World by Bryan Magee.
I would also like to thank all of those who agreed to be interviewed, or who have read particular chapters, or helped in other ways. Many are mentioned within these pages, but I would like to separately acknowledge James Dyson, Owain Hughes, David Halpern and the Behavioural Insights Team, Jim Manzi, David Bentley, Carol Dweck, Robert Dodds, Sidney Dekker, Steve Art, Meghan Mahoney, the wonderful people at Mercedes F1 and Team Sky, Toby Ord, Mark McCarthy, Tony McHale, Rita Weeks, David Beckham, Steve Jones and Esther Duflo.
Most of all, I would like to thank Kathy, my amazing wife, Evie and Teddy, our children, and Abbas and Dilys, my parents. This is for you.
Notes
CHAPTER 1: A ROUTINE OPERATION
1. Material on Elaine Bromiley’s operation based on interviews with Martin, Victoria and Adam Bromiley, the independent report by Dr. Michael Harmer, and other supporting documents.
2. Daniel Coyle, The Talent Code: Greatness Isn’t Born. It’s Grown. Here’s How (New York: Random House, 2009).
3. http://www.iata.org/publications/Documents/iata-safety-report-2013.pdf.
4. http://www.iata.org/pressroom/pr/Pages/2015-03-09-01.aspx.
5. Members of the IATA. http://www.iata.org/pressroom/facts_figures/fact_sheets/Documents/safety-fact-sheet.pdf.
6. “To Err Is Human,” by the Institute of Medicine: https://www.iom.edu/~/media/Files/Report%20Files/1999/To-Err-is-Human/To%20Err%20is%20Human%201999%20%20report%20brief.pdf.
7. Peter I. Buerhaus, “Lucian Leape on the Causes and Prevention of Errors and Adverse Events in Health Care,” Journal of Nursing Scholarship, June 2007.
8. http://journals.lww.com/journalpatientsafety/Fulltext/2013/09000/A_New,_Evidence_based_Estimate_of_Patient_Harms.2.aspx.
9. http://www.c-span.org/video/?320495-1/hearing-patient-safety.
10. Joe Graedon and Teresa Graedon, Top Screwups Doctors Make and How to Avoid Them (New York: Harmony, 2012).
11. http://www.c-span.org/video/?320495-1/hearing-patient-safety.
12. Ibid.
13. “A Safer Place for Patients: Learning to Improve Patient Safety,” National Audit Office report, November 3, 2005.
14. Atul Gawande, Complications: A Surgeon’s Notes on an Imperfect Science (London: Profile, 2008).
15. http://www.who.int/classifications/help/icdfaq/en/.
16. CBS News story, April 21, 2014, http://www.cbsnews.com/news/ferry-captains-acts-murderous-south-korean-president/.
17. Sidney Dekker, lecture in Brisbane: https://vimeo.com/102167635.
18. Gerry Greenstone, “The History of Bloodletting,” British Columbia Medical Journal, January 2010.
19. Nancy Berlinger, After Harm: Medical Error and the Ethics of Forgiveness (Baltimore: Johns Hopkins University Press, 2007).
20. Compared with similar centers nearby, it had lower compensation claims than all but seven of its competitors. But see also Dr. David Studdert et al., “Disclosure of Medical Injury to Patients” in Health Affairs 26, no. 1 (2007): 215–26.
21. C. A. Vincent, M. Young, and A. Phillips, “Why Do People Sue Doctors? A Study of Patients and Relatives Taking Legal Action,” Lancet 343, no. 8913 (1994): 1609–13.
22. http://www.ncbi.nlm.nih.gov/pubmed/18981794.
23. David Hilfiker, “Facing Our Mistakes,” New England Journal of Medicine 310, no. 2 (1984): 118–22.
24. James Reason, A Life in Error: From Little Slips to Big Disasters (Burlington, VT: Ashgate, 2013).
25. Rae M. Lamb, “Hospital Disclosure Practices: Results of a National Survey,” Health Affairs 22, no. 2 (2003): 73–83.
26. http://www.chron.com/news/article/Detective-work-required-to-uncover-errors-1709000.php.
27. J. L. Vincent, “Information in the ICU: Are We Being Honest with Patients? The Results of a European Questionnaire,” Intensive Care Medicine 24, no. 12 (1998): 1251–56.