9/11 attacks, 111–12, 117
November Oscar incident, 239–49
Novum Organum (Bacon), 134n, 279
nozzle design, Unilever, 125–26, 128, 129, 137, 147, 286
nurse unit administration, and blame, 226–27, 230–31
Obama, Barack, 39
observational statistics, 159, 160, 165
Odean, Terence, 101
Ofshe, Richard, 80
Ogburn, William, 201
Omphalos (Gosse), 42–43
open loops, 14, 116, 180
openness, 229–31, 234–35
optimization, 189–91
Ord, Toby, 148, 149, 177–78
Orland, Ted, 140–41
Osborn, Alex Faickney, 196–97
Otero, Dr. Henry, 50
Owen, Jason, 236
P.A.C.E. (Probe, Alert, Challenge, Emergency), 30
Page, Larry, 199
parole, 118–19
Patient Safety Alerts, 49, 50, 51
Pavlov, Ivan, 109
perception, 6, 24–25, 28–29, 30
perfectionism, 16–17, 140–41
perseverance, 262–65
Phillips, Charles, 278
pilot schemes, 290–91
Pixar, 207–10
Plato, 278
Poincaré, Henri, 201, 202
politics/politicians, 141, 283, 284
blame and, 234
Iraq War and, 73–74, 90–94
Popper, Karl, 41, 43–44, 103, 235, 267, 277, 280, 288
Portal, Nicholas, 171
practical knowledge, 212
practice environments, 32–33, 45–46
pre-closed loop behavior, 140
pre-mortems, 291
problem phase of innovation, 195–200
professionalism, 12
progress, 7–8
Pronovost, Peter J., 10, 52–53, 103–5, 106–7
prospective hindsight, 291–92
Pruchnicki, Shawn, 26, 31
pseudoscience, 42–44
psychotherapy, 43–44, 46–47, 288–89
Putnam, Hilary, 282
Pythagoras, 278
quantitative easing, 94–96
radio, wind-up, 195
radiologists, 47–48
radiology, 65–66
random allocation, 156n
randomized control trials (RCTs), 154–59, 285, 291
African aid efficacy and, 175–78
Capital One and, 185–86
criminal justice system programs, lack of RCTs for, 158
employment policy and, 187
Google and, 184–85
marginal gains theory and, 175, 176–77
medicine and, 157–58
morality of, 177
of Scared Straight program efficacy, 160, 162–64
real-time data, 26
Reason, James, 17, 58–59
“Reasonable Choice of Disaster, The” (Lanir), 221
religion, 111–12, 281–82
Rehnquist, William, 84
resources, 11, 31–32
Ries, Eric, 142–43, 189
Rivera, Juan, 64–65, 70–71, 82–83, 116, 120
Robinson, Alan, 179
Roosevelt, Eleanor, 25
Rosberg, Nico, 183, 184
Royal Aeronautical Society, 26
Royal Navy, 56
Rush, Dr. Benjamin, 13–14
Russia, 108–10
Sachs, Jeffrey, 174
Saddam Hussein, 73, 91
Safe Patients, Smart Hospitals (Pronovost), 53
safety
aviation and, 8–9, 19–20, 24, 26, 38–40
health care and, 49
system, 17, 18, 45
Scalia, Antonin, 84n
scapegoating, 12
Scared Straight! 20 Years Later (documentary), 159
Scared Straight program, 150–54, 159–67
Campbell Collaboration’s systematic review of, 164–65
Finckenauer’s randomized control trial (RCT) of, 160, 162–64
Scheck, Barry, 67, 68, 70, 77, 78, 80, 82, 84, 85, 117
Schulz, Kathryn, 78–79, 81
Schumpeter, Joseph, 130
science, 41–45, 48
ancient Greeks and, 278–79
Bacon’s criticism of medieval, 279–80, 283
failure and, 266
history of, 277–82
Lysenko and, 108–10
method and mindset of, 51–52
scurvy, 56
second victim, 239
selection bias, 161–62
self-esteem, 74, 75–76, 82, 90, 97, 98, 101, 274
self-handicapping, 272–74
self-justification, 18, 87, 88–89, 90, 97–99
and Iraq War decisions, 92–93
Shapiro, Arnold, 153, 166
Shepherd-Barron, John, 196
Shirley, Michael, 69
Shoemaker, Paul, 102
Shoesmith, Sharon, 236, 239
signatures, 11, 18, 24, 52
Simeone, Diego, 274
Simons, Daniel, 117
Sims, Peter, 139–40, 144
Singer, Paul, 95
Skiles, Jeffrey, 38, 39
Slemmer, Mike, 138–40
soccer, 135–36, 253–55, 274–76, 289–90
social hierarchies, as inhibiting assertiveness, 28–29
Social Science and Medicine (journal), 89
social tolerance, 285
social workers, 236–38, 239
social world, 283–87
Socrates, 278
software design, 138–40
South Korean ferry disaster, 12
Soyfer, Valery, 109
speed-eating, 187–88
Spelling Bees, 263
Speziale, Angelo, 165–67
sports, 132n, 135–36, 266, 289–90. See also cycling; Formula One; soccer
Staker, Holly, 63, 64, 70, 82–83, 119, 120, 121
Stalin, Joseph, 109
Stanton, Andrew, 210, 212
steam engine, 132
Stern, Sam, 179
Stewart, William Glen, 240, 241–42, 243, 244, 245, 246, 247–49
stigmatization, 40, 97, 105
stock market, 101, 264
Stone, Jeff, 91
stroller, collapsible, 195, 199
structure of systems that learn from failure, 125–49
cumulative selection/adaptation and, 128–29, 130, 292
free market systems and, 129–31, 284
guided missile approach to success and, 146
lean start-ups and, 141–45
narrative fallacy and, 135–38, 147–49
perfectionism, dangers of, 140–41
software design and, 138–40
technological change and, 131–35
testing and, 128–31
Unilever nozzle and, 125–26, 128, 129, 137, 147, 286
success, 7, 15, 19, 266–67
blind spot created by, 48
failure and, 39–40
Sullenberger, Chesley, 38, 39, 40–41
Sun, 236
Supreme Court, U.S., 84–85
surgery, 3–6, 15–16, 18
Swinmurn, Nick, 143
Syria, 92
systematic review, 164–65
system safety, 17, 18, 45
Taleb, Nassim Nicholas, 44–45, 133, 135
Tavris, Carol, 75, 93
Taylor, John, 95, 96
TD-Gammon, 134–35
Team Sky, 171–73, 179
technology/technological change, 19, 39, 131–35
bottom-up testing and learning and, 132–34
linear model of, 131–33
theory and, 133–34
Tellis, Gerard J., 205
temporal difference learning, 134–35
testing, 128–49
AIDS/HIV, strategies to combat, 147–49
lean start-ups and, 141–45
narrative fallacy as obstacle to, 135–38, 147–49
perfectionism, dangers of, 140–41
randomized control trials (RCTs) (See randomized control trials (RCTs))
of Scared Straight program efficacy, 160–65
software design and, 138–40
technological change and, 131–35
Tetlock, Philip, 99
theory, 133–34, 212
theory of relativity, 42, 133, 192, 195, 202
thermodynamics, laws of, 132
Think Like a Freak (Levitt and Dubner), 187–88
Thomas, Dorothy, 201
Thompson, W. Leigh, 268
Thomson, Donald, 115
3M, 144
Time, 39, 53
time, perception of, 28–29, 30, 59
Tour de France, 171–73
Toyota Production System (TPS), 48–49, 51, 290
Toy Story (film), 207
Toy Story 2 (film), 207, 208–9
training, 30–31, 47–48
trial by jury, 118, 119
Tyson, Neil deGrasse, 111–12, 113, 114, 117
Uncontrolled (Manzi), 187
Unilever, 125–26, 128, 137, 147
unindicted co-ejaculator theory, 81
United Airlines, 21–25
United Airlines 173, 20, 27–31, 39, 40, 84
United Kingdom
criminal justice system reforms and, 117
health care and, 10, 18, 54–55
math proficiency in, 271
United States of America
DNA testing and, 84
economics and, 94–97, 98
entrepreneurship culture and, 270–71
health care and, 9–10, 17, 32, 49–54, 55–56, 106
math proficiency in, 271
US Airways Flight 1549, 38, 39–40
U.S. Army, 19, 261–63
Vanier, Andre, 138–40
variation, 286
Vesalius, Andreas, 279
Veterans Affairs Medical Center, 16
Virginia Mason Health System, 48, 49–52, 53, 290
Vowles, James, 180–81, 182, 183, 184
Vries, Hugo de, 201
Wald, Abraham, 33–37
Wald, Martin, 33, 34
Wallace, Alfred Russel, 201
Wall Street Journal, 95
war, 278
Ward, Maria, 236
Watt, James, 132
weapons of mass destruction (WMD), 73–74, 91–92, 93, 94
West Point, 261–63
When Prophecy Fails (Festinger), 71n, 72
White Man’s Burden, The (Easterly), 174
Why Smart Executives Fail: And What You Can Learn from Their Mistakes (Finkelstein), 100
Wiggins, Bradley, 172
Wilkinson, Stephan, 242, 245, 248–49
Will and Vision (Tellis and Golder), 205
Wilson, Kevin, 35
Wimbledon High School, 267–68, 269
wind-up radio, 195
Wolff, Toto, 182
World Health Organization, 11
World War II, 33–37
Wright brothers, 199
wrongful convictions, 63–71, 77–85, 114–17
Borchard’s compilation of, 67
Bromgard case, 77–79, 116
cognitive dissonance and, 79–83
DNA evidence and, 68–71, 77, 79–83, 84, 120
drive-bys and, 114
exonerations through DNA testing, 69–70
eyewitness identification and, 114–15
false confessions and, 116
finality doctrines and, 84
hair analysis and, 116
justice system’s initial refusal to learn from, 67–68
as learning opportunity, 65
lineups and, 115–16
memory and, 114
prosecutorial responses to exonerating DNA evidence, 78–83
reform and, 115–17
Rivera case, 63–65, 70–71, 82–83, 116, 119–21
Supreme Court policy of reviewing cases involving procedural errors only, 84–85
Xenophanes, 278
Zappos, 143
*All names of medical staff have been changed to protect anonymity.
*Today the “black” boxes are actually bright orange in color, to improve visibility, and are often combined in a single unit.
*The first proper clinical trial, according to many historians, was conducted by James Lind, a Scottish naval surgeon, in 1747. He was trying to find a cure for scurvy and conducted a test of the efficacy of citrus fruit on sailors aboard HMS Salisbury, a Royal Navy ship.
*It has been argued by some doctors that it makes sense to cover up mistakes. After all, if patients were to find out about the scale of medical error, they might refuse to accept any treatment at all, which might make the overall situation even worse. But this misses the point. The problem isn’t that patients aren’t finding out about mistakes; it’s that doctors aren’t finding out about them either, and are therefore unable to learn from them. Besides, concealing failure rates from patients undermines their ability to make rational choices; patients have a right to know about the appropriate risks before undergoing treatment.
*Awareness of small errors has vital implications for companies, too. As Amy Edmondson, a professor at Harvard Business School, puts it: “Most large failures have multiple causes, and some of these causes are deeply embedded in organizations . . . Small failures are the early warning signs that are vital to avoiding catastrophic failure in the future.”
*In many circumstances, task-focused behavior is actually an effective way of applying one’s effort. The problem is when this focus comes at the expense of the “bigger picture.” This is when excessive focus undermines performance and, in the case of aviation, safety.
*We can see what this would look like in practice by applying it to a real-world event. This is what Jane, the head nurse, might have said if she had used this approach during Elaine Bromiley’s operation:
PROBE—“Doctor, what other options are you considering if we can’t get the tube in?”
ALERT—“Doctor, oxygen is at 40 percent and still dropping, and the tube is not going in. What about a tracheotomy kit?”
CHALLENGE—“Doctor, we need to conduct a tracheotomy now or we will lose the patient.”
EMERGENCY—“I’m alerting the resuscitation team to do the tracheotomy.”
*“Black box” sometimes has the connotation of an unknown and possibly inscrutable process lying between some input and its result. Here we are using it in the slightly different but related sense of the data recorder in an accident investigation.
*As a Parliamentary Select Committee report in the UK in 2015 put it: “Resources devoted to investigating and learning to improve clinical safety will save unnecessary expense by reducing avoidable harm to patients.”
*The precise relationship between failure and progress in science is a complex topic. There is much debate about when scientists can or should create new theories and paradigms in the light of challenging data. The philosopher Thomas Kuhn has written extensively on this subject. But the basic point that scientific theories should be testable, and therefore vulnerable, is almost universally agreed upon. Self-correction is a central aspect of how science progresses.
*In my 2010 book Bounce I explore this area in some detail. In this section, I do not rely on the ideas in Bounce. The point here is merely that extended practice seems to be a prerequisite for expertise in predictable environments.
*The only thing that does change over time is confidence, not performance. In one survey, 25 percent of psychotherapists put themselves in the top 10 percent of performers and none placed themselves below average.
*Daniel Kahneman illustrates this point by inviting us to think about how rapidly we learn to steer a car. The feedback is instant and objective. It takes far longer to learn how to steer a ship, because there are long delays between actions and noticeable outcomes.
*This may also help to explain why mortality and morbidity conferences—recurring meetings among clinicians designed to improve patient care—have not made a significant dent in avoidable mistakes. These are held regularly by medical centers and are supposed to give practitioners an opportunity to learn from mistakes. But clinicians are often nervous about speaking up or reporting on their colleagues. Perhaps even more important, there is little attempt to probe systemic problems.
*In June 2015, it was reported that as many as 1,000 babies are dying before, during, or after birth each year due to avoidable mistakes in the NHS. One simple error of failing to monitor babies’ heart rates properly accounts for a quarter of negligence payouts.
*In England and Wales, autopsies are ordered whenever the cause of death is officially unknown, or when the death occurred in suspicious circumstances. In 2013, nearly 20 percent of deaths required an autopsy.
*The case material is based on the work of the Innocence Project, interviews with Juan Rivera, Rivera’s lawyers, and Barry Scheck, plus contemporaneous and archive newspaper and media reports, including an e-mail exchange with Andrew Martin, who wrote on the case for the New York Times.
*Her real name was Dorothy Martin but, in order to protect her anonymity, Festinger changed the name in his seminal book When Prophecy Fails.
*Justice Antonin Scalia has gone even further. In a case in 2009, he said: “This Court has never held that the Constitution forbids the execution of a convicted defendant who has had a full and fair trial but is later able to convince a . . . court that he is ‘actually’ innocent.”