The Great Mental Models


by Shane Parrish


  And so, dark matter remains, right now, the simplest explanation for the peculiar behavior of galaxies. Scientists, however, continue to try to conclusively discover dark matter and thus try to determine if our understanding of the world is correct. If dark matter eventually becomes too complicated an explanation, it could be that the data describes something we don’t yet understand about the universe. We can then apply Occam’s Razor to update to what is the simplest, and thus easiest to verify, explanation. Vera Rubin herself, after noting that scientists always felt like they were ten years away from discovering dark matter without ever closing the gap, was described in an interview as thinking, “The longer that dark matter went undetected, … the more likely she thought the solution to the mystery would be a modification to our understanding of gravity.”11 This claim, demanding a total overhaul of our established theories of gravity, would correspondingly require extraordinary proof!

  Simplicity can increase efficiency

  With limited time and resources, it is not possible to track down every theory with a plausible explanation of a complex, uncertain event. Without the filter of Occam’s Razor, we are stuck chasing down dead ends. We waste time, resources, and energy.

  The great thing about simplicity is that it can be so powerful. Sometimes unnecessary complexity just papers over the systemic flaws that will eventually choke us. Opting for the simple helps us make decisions based on how things really are. Here are two short examples of those who got waylaid chasing down complicated solutions when simple ones were most effective.

  The ten-acre Ivanhoe Reservoir in Los Angeles provides drinking water for over 600,000 people. Its nearly 60 million gallons of water are disinfected with chlorine, as is common practice.12 Ground water often contains elevated levels of a chemical called bromide. When chlorine and bromide mix, then are exposed to sunlight, they create a dangerous carcinogen called bromate.

  In order to avoid poisoning the water supply, the L.A. Department of Water and Power (DWP) needed a way to shade the water’s surface. Brainstorming sessions had yielded only two infeasible solutions, building either a ten-acre tarp or a huge retractable dome over the reservoir. Then a DWP biologist suggested using “bird balls,” the floating balls that airports use to keep birds from congregating near runways. They require no construction, no parts, no labor, no maintenance, and cost US$0.40 each. Three million UV-deflecting black balls were then deployed in Ivanhoe and other LA reservoirs, a simple solution to a potentially serious problem.

  Occam’s Razor in the Medical Field

  Occam’s Razor can be quite powerful in the medical field, for both doctors and patients. Let’s suppose that a patient shows up at a doctor’s office with horrible flu-like symptoms. Are they more likely to have the flu or have contracted Ebola?

  This is a problem best solved by a concept we explored in the chapter on Probabilistic Thinking, called Bayesian Updating. It’s a way of using general background knowledge in solving specific problems with new information. We know that generally the flu is far more common than Ebola, so when a good doctor encounters a patient with what looks like the flu, the simplest explanation is almost certainly the correct one. A diagnosis of Ebola means a call to the Centers for Disease Control and a quarantine—an expensive and panic-inducing mistake if the patient just has the flu. Thus, medical students are taught to heed the saying, “When you hear hoofbeats, think horses, not zebras.”
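  The doctor’s reasoning can be sketched with Bayes’ rule. All of the numbers below are invented for illustration; neither the base rates nor the symptom likelihoods are real epidemiological data.

```python
# Illustrative Bayesian update: flu vs. Ebola, given flu-like symptoms.
# Every number here is an assumption made up for this sketch.

prior = {"flu": 0.99999, "ebola": 0.00001}   # assumed base rates
likelihood = {"flu": 0.90, "ebola": 0.95}    # assumed P(symptoms | disease)

# Bayes' rule: P(disease | symptoms) is proportional to
# P(symptoms | disease) * P(disease).
unnormalized = {d: prior[d] * likelihood[d] for d in prior}
total = sum(unnormalized.values())
posterior = {d: p / total for d, p in unnormalized.items()}

print(posterior["flu"])  # flu overwhelmingly dominates the posterior
```

  Even though Ebola explains the symptoms slightly better in this toy model, the flu’s overwhelming base rate dominates the result, which is the “horses, not zebras” heuristic in numerical form.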

  And for patients, Occam’s Razor is a good counter to hypochondria. Based on the same principles, you factor in the current state of your health to an evaluation of your current symptoms. Knowing that the simplest explanation is most likely to be true can help us avoid unnecessary panic and stress.

  In another life-and-death situation, in 1989 Bengal tigers killed about 60 villagers from India’s Ganges delta.13 No deterrent seemed to work against them, including lacing dummies with live wires to shock the tigers away from human populations.

  Then a student at the Science Club of Calcutta noticed that tigers only attacked when they thought they were unseen, and recalled that the patterns decorating some species of butterflies, beetles, and caterpillars look like big eyes, ostensibly to trick predators into thinking their prey was also watching them. The result: a human face mask, worn on the back of the head. Remarkably, no one wearing a mask was attacked by a tiger for the next three years; anyone killed by tigers during that time had either refused to wear the mask, or had taken it off while working. — Sidebar: Occam’s Razor in the Medical Field

  A few caveats

  One important counter to Occam’s Razor is the difficult truth that some things are simply not that simple. The regular recurrence of fraudulent human organizations like pyramid schemes and Ponzi schemes is not a miracle, but neither is it obvious. No simple explanation suffices, exactly. They are a result of a complex set of behaviors, some happening almost by accident or luck, and some carefully designed with the intent to deceive. It isn’t at all easy to spot the development of a fraud. If it were, frauds would be stamped out early. Yet, to this day, frauds frequently grow to epic proportions before they are discovered.

  Alternatively, consider the achievement of human flight. It, too, might seem like a miracle to our 17th century friar, but it isn’t—it’s a natural consequence of applied physics. Still, it took a long time for humans to figure out because it’s not simple at all. In fact, the invention of powered human flight is highly counterintuitive, requiring an understanding of airflow, lift, drag, and combustion, among other difficult concepts. Only a precise combination of the right factors will do. You can’t just know enough to get the aircraft off the ground, you need to keep it in the air!

  The Razor in Leadership

  When Louis Gerstner took over IBM in the early 1990s, during one of the worst periods of struggle in its history, many business pundits called for a statement of his vision. What rabbit would Gerstner pull out of his hat to save Big Blue?

  It seemed a logical enough demand—wouldn’t a technology company that had fallen behind need a grand vision of brilliant technological leadership to regain its place among the leaders of American innovation? As Gerstner put it, “The IBM organization, so full of brilliant, insightful people, would have loved to receive a bold recipe for success—the more sophisticated, the more complicated the recipe, the better everyone would have liked it.”

  Smartly, Gerstner realized that the simple approach was most likely to be the effective one. His famous reply was that “the last thing IBM needs right now is a vision.” What IBM actually needed to do was to serve its customers, compete for business in the here and now, and focus on businesses that were already profitable. It needed simple, tough-minded business execution.

  By the end of the 1990s, Gerstner had provided exactly that, bringing IBM back from the brink without any brilliant visions or massive technological overhauls.

  _

  Gerstner, Louis V. Who Says Elephants Can’t Dance? Leading a Great Enterprise Through Dramatic Change. New York: HarperCollins, 2003.

  Simple as we wish things were, irreducible complexity, like simplicity, is a part of our reality. Therefore, we can’t use this Razor to create artificial simplicity. If something cannot be broken down any further, we must deal with it as it is.

  How do you know something is as simple as it can be? Think of computer code. Code can sometimes be excessively complex. In trying to simplify it, we would still have to make sure it can perform the functions we need it to. This is one way to understand simplicity. An explanation can be simplified only to the extent that it can still provide an accurate understanding.
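  A minimal sketch of that idea: both hypothetical functions below compute the same thing, and the second is “as simple as it can be” only because it still performs the function we need.

```python
# An overly complex way to test whether a number is even.
def is_even_complex(n):
    result = False
    for candidate in range(0, abs(n) + 1, 2):  # walk the even numbers up to |n|
        if candidate == abs(n):
            result = True
            break
    return result

# The simplified version: identical behavior, far less machinery.
def is_even_simple(n):
    return n % 2 == 0

# Simplification is only valid if the behavior is preserved.
for n in range(-10, 11):
    assert is_even_complex(n) == is_even_simple(n)
```

  The check at the end is the point: we may strip away complexity only up to the line where the code, like an explanation, still does everything we need it to.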

  Conclusion

  Of course, focusing on simplicity when all others are focused on complexity is a hallmark of genius, and it’s easier said than done. But always remembering that a simpler explanation is more likely to be correct than a complicated one goes a long way towards helping us conserve our most precious resources of time and energy. — Sidebar: The Razor in Leadership

  I need to listen well so that I hear what is not said.

  Thuli Madonsela1

  The People Who Appear in this Chapter

  Honorius.

  384-423 - Western Roman Emperor for 30 years. His reign was chaotic and messy, and saw Rome being sacked for the first time in almost 800 years.

  Stilicho.

  359-408 - High ranking general in the Roman army. Half Vandal, his regency for Honorius marked the high point of Germanic advancement in the service of Rome.

  Arkhipov, Vasili.

  1926-1998 - Russian. Retired as a Vice-Admiral in the Soviet Navy.

  In 1961 he was deputy commander of K-19. The events on board inspired the Harrison Ford movie, K-19: The Widowmaker.

  Hanlon’s Razor

  Hard to trace in its origin, Hanlon’s Razor states that we should not attribute to malice that which is more easily explained by stupidity. In a complex world, using this model helps us avoid paranoia and ideology. By not generally assuming that bad results are the fault of a bad actor, we look for options instead of missing opportunities. This model reminds us that people do make mistakes. It demands that we ask if there is another reasonable explanation for the events that have occurred. The explanation most likely to be right is the one that contains the least amount of intent.

  Assuming the worst intent crops up all over our lives. Consider road rage, a growing problem in a world that is becoming short on patience and time. When someone cuts you off, to assume malice is to assume the other person has done a lot of risky work. In order for someone to deliberately get in your way they have to notice you, gauge the speed of your car, consider where you are headed, and swerve in at exactly the right time to cause you to slam on the brakes, yet not cause an accident. That is some effort. The simpler and thus more likely explanation is that they didn’t see you. It was a mistake. There was no intent. So why would you assume the former? Why do our minds make these kinds of connections when the logic says otherwise?

  The famous Linda problem, demonstrated by the psychologists Daniel Kahneman2 and Amos Tversky in a 1982 paper, is an illuminating example of how our minds work and why we need Hanlon’s Razor. It went like this:

  Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

  Which is more probable?

  Linda is a bank teller.

  Linda is a bank teller and is active in the feminist movement.

  The majority of respondents chose option 2. Why? The wording used to describe her suggests Linda is a feminist. But Linda could either be just a bank teller, or a feminist and a bank teller. So naturally the majority of students concluded she was both. They didn’t know anything about what she did, but because they were led to believe she had to be a feminist they couldn’t reject that option, even though basic probability makes a single condition more likely than that same condition combined with another. In other words, every feminist bank teller is a bank teller, but not every bank teller is a feminist.
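  That last point is the whole conjunction rule. A toy calculation, with probabilities invented purely for illustration, shows why option 2 can never be more probable than option 1:

```python
# P(teller AND feminist) = P(teller) * P(feminist | teller).
# Since P(feminist | teller) <= 1, the conjunction can never be more
# probable than "bank teller" alone. The numbers are invented.

p_teller = 0.05                  # assumed: Linda is a bank teller
p_feminist_given_teller = 0.90   # assumed: feminist, given she is a teller

p_both = p_teller * p_feminist_given_teller

print(p_both <= p_teller)  # True, for any probabilities in [0, 1]
```

  No matter how strongly the description suggests Linda is a feminist, multiplying by a probability can only shrink the result, never grow it.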

  Thus, Kahneman and Tversky showed that students would, with enough vivid wording, assume it more likely that a liberal-leaning woman was both a feminist and a bank teller rather than simply a bank teller. They called it the “Fallacy of Conjunction.”

  With this experiment, and a host of others, Kahneman and Tversky exposed a sort of tic in our mental machinery: we’re deeply affected by vivid, available evidence, to such a degree that we’re willing to make judgments that violate simple logic. We over-conclude based on the available information. We have no trouble packaging in unrelated factors if they happen to occur in proximity to what we already believe.

  The Linda problem was later criticized as the psychologists setting their test subjects up for failure. If it was stated in a different way, subjects did not always make the error. But this of course was their point. If we present the evidence in a certain light, the brain malfunctions. It doesn’t weigh out the variables in a rational way.

  What does this have to do with Hanlon’s Razor? The connection is this:

  When we see something we don’t like happen and which seems wrong, we assume it’s intentional. But it’s more likely that it’s completely unintentional. Assuming someone is doing wrong and doing it purposefully is like assuming Linda is more likely to be a bank teller and a feminist. Most people doing wrong are not bad people trying to be malicious.

  With such vividness, and the associated emotional response, comes a sort of malfunctioning in our minds when we’re trying to diagnose the causes of a bad situation. That’s why we need Hanlon’s Razor as an important remedy. Failing to prioritize stupidity over malice causes things like paranoia. Always assuming malice puts you at the center of everyone else’s world. This is an incredibly self-centered approach to life. In reality, for every act of malice, there is almost certainly far more ignorance, stupidity, and laziness.

  «One is tempted to define man as a rational animal who always loses his temper when he is called upon to act in accordance with the dictates of reason.»

  Oscar Wilde3

  The end of an empire

  In 408 AD, Honorius was the Emperor of the Western Roman Empire. He assumed malicious intentions on the part of his best General, Stilicho, and had him executed. According to some historians, this execution may have been a key factor in the collapse of the Empire.4,5

  Why? Stilicho was an exceptional military general who won many campaigns for Rome. He was also very loyal to the Empire. He was not, however, perfect. Like all people, he made some decisions with negative outcomes. One of these was persuading the Roman Senate to accede to the demands of Alaric, leader of the Visigoths. Alaric had attacked the Empire multiple times and was no favorite in Rome. They didn’t want to give in to his threats and wanted to fight him.

  _

  Rome didn’t fall in a day. It was a decades-long crumble that saw a dispersal of power in Europe and a steady dismantling of the Roman infrastructure.

  Stilicho counseled against this. Perhaps he had a relationship with Alaric and thought he could convince him to join forces and push back against the other invaders Rome was dealing with. Regardless of his reasoning, this action of Stilicho’s compromised his reputation.

  Honorius was thus persuaded of the undesirability of having Stilicho around. Instead of defending him, or giving him the benefit of the doubt on the Alaric issue, Honorius assumed malicious intent behind Stilicho’s actions—that he wanted the throne and so was making decisions to shore up his power. Honorius ordered the general’s arrest and likely supported his execution.

  Without Stilicho to influence the relationship with the Goths, the Empire became a military disaster. Alaric sacked Rome two years later, the first barbarian to capture the city in nearly eight centuries. Rome was thus compromised, a huge contributing factor to the collapse of the Western Roman Empire.

  Hanlon’s Razor, when practiced diligently as a counter to confirmation bias, empowers us, and gives us far more realistic and effective options for remedying bad situations. When we assume someone is out to get us, our very natural instinct is to take actions to defend ourselves. It’s harder to take advantage of, or even see, opportunities while in this defensive mode because our priority is saving ourselves—which tends to reduce our vision to dealing with the perceived threat instead of examining the bigger picture.

  _

  By not assuming the worst, Vasili Arkhipov single-handedly avoided nuclear war with the Americans.

  The man who saved the world

  On October 27, 1962, Vasili Arkhipov stayed calm, didn’t assume malice, and saved the world. Seriously.

  This was the height of the Cuban missile crisis. Tensions were high between the United States and the Soviet Union. The world felt on the verge of nuclear war, a catastrophic outcome for all.

  American destroyers and Soviet subs were in a standoff in the waters off Cuba. Although they were technically in international waters, the Americans had informed the Soviets that they would be dropping blank depth charges to force the Soviet submarines to surface. The problem was, Soviet HQ had failed to pass this information along, so the subs in the area were ignorant of the planned American action.6

  Arkhipov was an officer aboard Soviet sub B-59—a sub that, unbeknownst to the Americans, was carrying a nuclear weapon. When the depth charges began to detonate above them, the Soviets on board B-59 assumed the worst. Convinced that war had broken out, the captain of the sub wanted to arm and deploy the nuclear-tipped torpedo.

  This would have been an unprecedented disaster. It would have significantly changed the world as we know it, with both the geopolitical and nuclear fallout affecting us for decades. Luckily for us, the launch of the torpedo required all three senior officers on board to agree, and Arkhipov didn’t. Instead of assuming malice, he stayed calm and insisted on surfacing to contact Moscow.

  Although the explosions around the submarine could have been malicious, Arkhipov realized that to assume so would put the lives of billions in peril. Far better to suppose mistakes and ignorance, and base the decision not to launch on that. In doing so, he saved the world.

  They surfaced and returned to Moscow. Arkhipov wasn’t hailed as a hero until the record was declassified 40 years later, when documents revealed just how close the world had come to nuclear war.

  The Devil Fallacy

  Robert Heinlein’s character Doc Graves describes the Devil Fallacy in the 1941 sci-fi story “Logic of Empire”, as he explains the theory to another character:

  “I would say you’ve fallen into the commonest fallacy of all in dealing with social and economic subjects—the ‘devil’ theory. You have attributed conditions to villainy that simply result from stupidity…. You think bankers are scoundrels. They are not. Nor are company officials, nor patrons, nor the governing classes back on earth. Men are constrained by necessity and build up rationalizations to account for their acts.”

 
