Scott Adams and Philosophy




  Be like Dilbert in the February 2010 strip when he realizes that, during his work week, he has produced nothing but useless PowerPoint slides. He screams in agony at the thought, “My brain is eating my body . . .”

  I’ve said enough. Do your own research. Me? I’m off to Facebook to post a picture of a plate of food I’m going to eat tonight.

  III

  It Tastes Better if We All Do It Together

  7

  Scott Adams’s Joy of Logic

  RICHARD BILSKER

  Among the morbid delights of the Dilbert comic strip are the delicious instances of bad thinking displayed by the Pointy-Haired Boss. In his book, The Joy of Work: Dilbert’s Guide to Finding Happiness at the Expense of Your Co-Workers, Scott Adams includes a chapter called “Managing Your Co-Workers.” This has long been a favorite of mine and its ideas have found their way into logic classes I have taught over many years.

  After discussing such heady topics as cubicle flatulence and office moves, the chapter has a section on “Dealing with Irrational Co-Workers.” This section includes instances of logical fallacies and cognitive biases interspersed with Dilbert strips. Understanding these examples can certainly help you to become a better critical thinker.

  Cognitive Biases and Logical Fallacies

  Paul Herrick has described cognitive biases as psychological obstacles to critical thinking. They are patterns of pre-logical thinking that make it harder for us to use reason to draw conclusions. Though there are some controversies regarding the number of biases and their extent, there is an evolving set of biases recognized (and tested for) by psychologists and behavioral economists. Some, like being self-centered (egocentric) or being susceptible to prejudice, have been long recognized.

  Fallacies are errors in reasoning. Formal fallacies are defects in the logical structure of deductive arguments. Informal fallacies are patterns of faulty reasoning so common that many of the oldest have been discussed since the medieval period and are still commonly referred to by their Latin names.
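  A classic instance of a formal fallacy (my illustration, not one of Adams’s) is affirming the consequent, where the argument is invalid by its shape alone, whatever it happens to be about:

```latex
% Affirming the consequent: a formal fallacy, invalid by logical shape alone.
% (An added illustration; not an example from Adams's book.)
\[
\frac{P \rightarrow Q \qquad Q}{\therefore\; P}
\]
% "If it rained, the street is wet. The street is wet. Therefore it rained."
% Invalid: a sprinkler could have wet the street instead.
```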

  Irrational Co-Workers (and Others)

  In “Dealing with Irrational Co-Workers,” Scott Adams gives us a humorous lesson in critical thinking. In most cases, he has changed the names . . . but probably not to protect the innocent. As he notes, “Nothing can reduce your happiness faster than an argument with an irrational co-worker.” His solution: since “irrational people are easily persuaded by anything that has been published,” use his book as the publication that persuades them they are wrong! Just photocopy this section of the book (“You Are Wrong Because . . .”) and circle the number of the appropriate example. He provides thirty-two examples. Here are some I have found most instructive.

  Analogy Arguments

  Analogy arguments are an important part of science and law. When drugs are tested on animals, the only warrant for drawing conclusions about humans is the similarity between the relevant biological systems of the two species. The same is true in law when a judge has to rule on admitting evidence, for example. Is the instance in front of me, she must consider, more like the precedent in U.S. v. Jones or more like the precedent in U.S. v. Smith? However, as the saying goes, you shouldn’t compare apples and oranges.

  A common fallacy is “faulty analogy.” This is a catch-all description for analogies that do not consider relevant features or circumstances. Adams’s example is called “Amazingly Bad Analogy” and goes like this: “You can train a dog to fetch a stick. Therefore, you can train a potato to dance.” The problem here is that not enough (or any, really) relevant similarities between dogs and potatoes have been established. Adams has another faulty analogy example called “Irrelevant Comparisons”: “A hundred dollars is a good price for a toaster, compared to buying a Ferrari.” Toasters to Ferraris is another kind of apples to oranges (or dogs to potatoes) example.

  The English philosopher Sir Francis Bacon (1561–1626) is often credited with laying out some of the basics of scientific method. He can also be credited with this bad analogy in his essay, “Of the True Greatness of Kingdoms and Estates”:

  No body can be healthful without exercise, neither natural body nor politic; and certainly to a kingdom or estate, a just and honorable war is the true exercise. A civil war, indeed, is like the heat of a fever; but a foreign war is like the heat of exercise, and serveth to keep the body in health; for in a slothful peace, both courages will effeminate and manners corrupt.

  Much like Toasters/Ferraris, Apples/Oranges, and Dogs/Potatoes, Human Body/Body Politic do not compare. It’s also not clear what justifies the analogy within an analogy here. How is civil war like a fever and a foreign war like exercise? He does not elaborate.

  Causal Arguments

  Causal arguments are constructed to give us good reason to accept some conclusion of the form “C caused E.” You might want to know why a group of people in a particular region all developed similar symptoms. What, if anything, did they have in common? What did they do differently from family members who didn’t have those symptoms? If you pay attention to the news, you may often note that E. coli or some other bacterium was found in a food source the victims were all exposed to at some point. The headline might be something like “E. coli in Burger Meat Found to be the Cause of Local Deaths and Hospitalizations.” Causal arguments are common in science.

  One common problem is “confusing correlation with cause.” A positive correlation is when you note that the presence of something seems to go with some other thing being present, too. A negative correlation is when the presence of something seems to go with the absence of some other thing (or when A goes up, B goes down). In economics, it is often said that unemployment rates and inflation rates are negatively correlated.

  Adams’s example is called “Faulty Cause and Effect”: “On the basis of my observations, wearing huge pants make you fat.” When two things are correlated, there are four possibilities: A caused B; B caused A; A and B are both caused by some third thing, C; or, lastly, the correlation is coincidence. Ideally, to determine cause, you need to rule out three of the four possibilities. The Adams example skips that step and gets the direction of causation backward: while it’s true that wearing large pants is correlated with being overweight, it is more likely from a causal standpoint that being overweight is responsible for the size of the pants, rather than the other way around. Let’s look at a recent example that has had more widespread impact than Adams’s huge pants.
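  If it helps to see the third possibility at work, here is a toy simulation (my illustration, not Adams’s; the ice-cream and drowning pairing is a stock textbook example, and every number is made up):

```python
# A toy simulation (my illustration, not from Adams's book) of the third
# possibility: A and B are correlated only because both are caused by C.
import random

random.seed(0)

ice_cream_sales = []  # "A": hypothetical daily ice-cream sales
drownings = []        # "B": hypothetical daily drownings

for _ in range(10_000):
    temperature = random.gauss(70, 15)  # "C": the common cause
    ice_cream_sales.append(2.0 * temperature + random.gauss(0, 10))
    drownings.append(0.1 * temperature + random.gauss(0, 1))

def mean(xs):
    return sum(xs) / len(xs)

def correlation(xs, ys):
    # Pearson correlation, computed by hand to keep the sketch dependency-free.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Strongly positive, yet neither variable causes the other: temperature does.
print(correlation(ice_cream_sales, drownings))
```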

  Economists Carmen Reinhart and Kenneth Rogoff claimed in a 2010 paper that carrying too much debt caused a nation’s economic growth to slow. According to their data, which were not published in the article itself, when the debt-to-GDP ratio of a nation crosses the threshold of ninety percent, growth slows. Their conclusion was used as an impetus for what are called austerity measures by Paul Ryan in the United States (his budget and “Path to Prosperity”) and by George Osborne, who was Chancellor of the Exchequer in the United Kingdom from 2010 to 2016.

  Austerity typically means you cut government spending to reduce the debt. Usually social services are among the first cuts. So, austerity can have wide-ranging consequences. In 2013, Thomas Herndon, Michael Ash, and Robert Pollin, three economists at the University of Massachusetts Amherst, published a working paper (using data provided to them by Reinhart and Rogoff) that points out a number of flaws in the Reinhart and Rogoff paper. One problem is that the data show only correlation, not cause. Paul Krugman, the 2008 Nobel Prize winner for Economics, goes further and argues that in several cases debt seems to rise only after growth slows.

  Another fallacy is “complex cause.” This occurs when you oversimplify a situation down to one cause out of many, typically a minor one or one favored by the speaker. This is common in arguments by politicians, when they want to focus on one thing that their party is against. Adams’s version is “Inability to Understand that Some Things Have Multiple Causes”: “The Beatles were popular for one reason only: They were good singers.” Much like the causes of World War I, there were many factors responsible for the popularity of The Beatles.

  “Objectionable (or false) cause” is a catch-all fallacy for drawing a causal conclusion from too little causal evidence. Adams gives us “Reaching Bizarre Conclusions without Any Information”: “The car won’t start. I’m certain the spark plugs have been stolen by rogue clowns.” Presumably, your co-worker made this proclamation before checking whether the spark plugs are still there. Without any good evidence, rogue clowns probably should not be your first hypothesis.

  In 1965 Gilbert Harman named a form of inductive argument called Inference to the Best Explanation. The hallmark of this kind of argument is that you start with what you want to explain (or determine the cause of) and then consider what might possibly explain it. Then, using criteria such as simplicity, explanatory power, and consistency, you narrow the candidates down to the one that seems best, given the currently available data. Many, if not most, of the episodes of the television show House were a series of Inferences to the Best Explanation. Dr. House and his team would be presented with a medical enigma. They would list all the features of the mystery and toss around ideas of what condition would explain the phenomena. Once they had chosen the best one, they would treat the patient based on the diagnosis. If the treatment didn’t work, they could eliminate one possible cause and also add new information to the data for a new round of diagnosis.
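  Caricatured in code, the procedure might look something like this (a sketch of my own, not Harman’s formalism; the candidate conditions and ratings are invented):

```python
# A toy sketch of Inference to the Best Explanation (my caricature, not
# Harman's formalism). The candidate conditions and ratings are invented.

# Each rival hypothesis is rated 0-10 on three rough criteria.
candidates = {
    "lupus":                 {"simplicity": 4, "explanatory_power": 9, "consistency": 6},
    "food allergy":          {"simplicity": 8, "explanatory_power": 5, "consistency": 7},
    "heavy-metal poisoning": {"simplicity": 6, "explanatory_power": 8, "consistency": 8},
}

def score(ratings):
    # Equal weights here; a real reasoner weighs the criteria case by case.
    return sum(ratings.values())

best = max(candidates, key=lambda name: score(candidates[name]))
print(best)  # "heavy-metal poisoning" under these made-up ratings

# House-style iteration: if treating the best hypothesis fails, drop it and
# re-run the inference with whatever new data the failure provides.
del candidates[best]
print(max(candidates, key=lambda name: score(candidates[name])))  # "food allergy"
```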

  The criterion of simplicity (sometimes called “parsimony” or “elegance”) is the idea that if you can explain something with a simpler hypothesis it is better to do so than to rely on a more complicated one. This was part of Copernicus’s motivation for the heliocentric theory, for example (though his finished product was more complicated).

  A famous name for this principle is Occam’s Razor, named for the medieval English philosopher William of Ockham (around 1287–1347). He said something like “Don’t multiply entities without necessity.” The razor is to “shave” off the unnecessary. This idea, too, gets the Adams treatment as “Overapplication of Occam’s Razor”: “The simplest explanation for the moon landings is that they were hoaxes.”

  The problem here is that it is not clear how this would be simpler. Given the number of people involved in the work (NASA, the media, ordinary people who witnessed the launches, and so forth), it looks as if a hoax would be much less simple. A good friend of mine who is a project manager at NASA still gets emails about hoaxes and conspiracies. History shows it is very hard to keep people quiet long enough to have a successful conspiracy.

  Another example that is similar is “Ignoring All Anecdotal Evidence”: “I always get hives immediately after eating strawberries. But without a scientifically controlled experiment, it’s not reliable data. So, I continue to eat strawberries every day, since I can’t tell if they cause hives.” This is another piece of poor causal reasoning that violates Occam’s Razor and probably IBE standards, too.

  Authority

  One handy shortcut, based on the availability of accumulated science, is that you do not need to do all the science yourself. If there is widespread (near-unanimous) agreement about something, you are able to argue for a conclusion based on expert authority, provided that your chosen expert is a respected authority in their field, representative of that widespread agreement, and is speaking within their area of expertise. Examples might include Albert Einstein on relativity theory or Stephen Hawking on cosmology.

  The “Fallacy of Misplaced Authority” occurs when your chosen representative does not meet the standards mentioned above. In 1986, an actor from the soap opera All My Children was in an ad for Vicks 44, a cough suppressant. In the ad, Peter Bergman utters the line, “I’m not a doctor, but I play one on TV.” The implication is that playing a doctor is close enough.

  Scott Adams gives us “Following the Advice of Known Idiots”: “Uncle Billy says pork makes you smarter. That’s good enough for me!” Nothing stated here makes Uncle Billy an expert in either nutrition or cognition. At the same time, Adams does not provide any reasons to call Uncle Billy a “known idiot.”

  Begging the Question

  The fallacy of “begging the question” occurs when an argument asserts the conclusion as one of its premises. Usually, it is not so bold as to use the same exact words. A simple example would be concluding someone is famous based on the premise that they are well-known. The problem arises because “well-known” and “famous” are synonymous. Circular arguments (or vicious circles) are often described as a subset of question-begging. The circle may be small or large, but the logical form might be something like A because B. How do you know B? Because C. How do you know C? Because A.

  A common example would be “Scientology is the correct way to view the universe. It clearly says so in Dianetics.” Adams does not have a funny name for his version, but his example is one we have probably all heard from a co-worker at some point in our working life. “Circular Reasoning”: “I’m correct because I’m smarter than you. And I must be smarter than you because I’m correct.” Question-begging and vicious circles are not helpful because you are not given any reason to accept the conclusion that is not the conclusion itself (or outside the circle).

  Cognitive Biases

  The Joy of Work also has some examples that might be better described as cognitive biases. My favorite is one Scott Adams calls “I Am the World”: “I don’t listen to country music. Therefore, country music is not popular.” This has similarities to the biases of “egocentrism” (I count more than others), “first-person bias” (evaluating the good of others based on the good for us), and “false consensus” (the idea that there is widespread agreement about something when there isn’t). There are also similarities to what is often called “the psychologist’s fallacy” (the “similar to me” stereotype).

  Stereotyping is another bias. Adams has a version of this called “The Few Are the Same as the Whole”: “Some Elbonians are animal rights activists. Some Elbonians wear fur coats. Therefore, Elbonians are hypocrites.” This example also violates one of the rules for categorical arguments: you cannot draw an “all” conclusion when both of your premises say only “some.”
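  Put into the notation of categorical logic (my reconstruction, not Adams’s), the Elbonian argument has this shape:

```latex
% The Elbonian argument as a categorical syllogism (my reconstruction).
% Two particular ("some") premises cannot support a universal ("all")
% conclusion.
\[
\frac{\text{Some Elbonians are activists} \qquad \text{Some Elbonians wear fur}}
     {\therefore\ \text{All Elbonians are hypocrites}}
\]
% Invalid: the activists and the fur wearers may be entirely different
% Elbonians, so hypocrisy follows for no one, let alone for everyone.
```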

  Confirmation Bias

  One of the running themes in Scott Adams’s most recent book, Win Bigly: Persuasion in a World Where Facts Don’t Matter, is confirmation bias. Paul Herrick describes confirmation bias as “the unconscious tendency to look harder for confirming evidence than for disconfirming evidence when investigating a matter.” Typically, this occurs when we have a vested interest in what we are trying to confirm. There are instances of confirmation bias in the 2010 Reinhart and Rogoff paper mentioned above. This was not discovered until 2013, when Herndon, Ash, and Pollin got access to the data. It seems that Reinhart and Rogoff cherry-picked their data, excluding some that did not support their claims. Further, there were math errors. Corrected, the numbers still suggest that growth slows as carried debt rises, but instead of the negative growth (–0.1 percent) that Reinhart and Rogoff claimed for debt ratios above ninety percent, growth only slowed to 2.2 percent, which is not significantly slower than the 2.8 percent they reported for the 60–90 percent range.
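  To see how much cherry-picking and selection errors can matter, here is a toy calculation (every number below is invented for illustration; this is not the actual Reinhart and Rogoff data):

```python
# A toy illustration of how excluding rows before averaging can flip a
# conclusion's sign. Every number is invented; this is NOT the actual
# Reinhart and Rogoff data.

# Hypothetical growth rates (percent) for countries with debt-to-GDP > 90%.
growth = {"A": 2.5, "B": 1.9, "C": 3.1, "D": -4.0, "E": -1.2}

full_mean = sum(growth.values()) / len(growth)

# Suppose a selection error (or cherry-picking) drops countries A, B, and C,
# much as a spreadsheet range error dropped rows in the real case.
kept = {k: v for k, v in growth.items() if k in ("D", "E")}
partial_mean = sum(kept.values()) / len(kept)

print(f"mean over all rows:  {full_mean:+.2f}%")    # +0.46%
print(f"mean over kept rows: {partial_mean:+.2f}%") # -2.60%
```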

  By the time these errors were found, people were already using the Reinhart and Rogoff data for policy decisions. In fact, those who supported austerity did not change their tune once the corrected figures were published. This, too, is confirmation bias. For Adams, confirmation bias is a big part of explaining Donald Trump’s win in the 2016 election and why his supporters do not change their minds about him. Facts don’t matter. As he puts it, “People don’t change opinions about emotional topics just because some information proved their opinion to be nonsense. Humans aren’t wired that way.”

  When discussing cognitive biases in my classes, I point out that they are unconscious, pre-logical patterns. The best we can do is become aware of them and try not to let them into our conscious logical thinking. On this view, cognitive biases are like programs running in the background of your computer. Adams sees it more radically. To him, the bias “isn’t an occasional bug in our human operating system. It is the operating system.” If he is right, then much of our attempt at being better critical thinkers is a waste of time. So would be our attempts to reason with our co-workers, or anyone else, about the world.

  Hardwired for Nonsense?

  When The Joy of Work was published, I was already a fan of Dilbert. I found the examples in the text useful for teaching logical fallacies, because they are more extreme than the ones found in logic textbooks, which tend to be rather dry. Twenty years on, I still find the chapter to be highly entertaining.

  I have a different reaction to Win Bigly. Much of the book is concerned with how Scott Adams was able to predict what seemed at the time to be the long shot of a Trump victory. Other parts of the book explain the success of Trump’s speaking style, which to the academic ear often sounds like nonsense. Although I take many of the pronouncements in Win Bigly with a grain of salt, I worry about the implications. What if we really are hardwired to accept nonsense as long as it’s presented in a certain way? What if the fact that I think I’m not susceptible to these persuasion techniques is just me patting myself on the back while I fall into different (but similar) traps? There are no easy answers here.

  My takeaway is that I think Adams overstates how incapable we are of overcoming biases. If Adams really is as good a “commercial grade persuader” as he tells us throughout the book, would he really need to restate it so often? I recommend that you read Win Bigly and draw your own conclusions. Aside from that, I recommend that you enjoy Dilbert strips and The Joy of Work—and try to avoid the common critical thinking traps I’ve mentioned here.

 
