The Black Swan


by Nassim Nicholas Taleb


  Philosophers since Aristotle have taught us that we are deep-thinking animals, and that we can learn by reasoning. It took a while to discover that we do effectively think, but that we more readily narrate backward in order to give ourselves the illusion of understanding, and give a cover to our past actions. The minute we forgot about this point, the “Enlightenment” came to drill it into our heads for a second time.

  I’d rather degrade us humans to a level certainly above other known animals but not quite on a par with the ideal Olympian man who can absorb philosophical statements and act accordingly. Indeed, if philosophy were that effective, the self-help section of the local bookstore would be of some use in consoling souls experiencing pain—but it isn’t. We forget to philosophize when under strain.

  I’ll end this section on prediction with the following two lessons, one very brief (for the small matters), one rather lengthy (for the large, important decisions).

  Being a Fool in the Right Places

  The lesson for the small is: be human! Accept that being human involves some amount of epistemic arrogance in running your affairs. Do not be ashamed of that. Do not try to always withhold judgment—opinions are the stuff of life. Do not try to avoid predicting—yes, after this diatribe about prediction I am not urging you to stop being a fool. Just be a fool in the right places.*

  What you should avoid is unnecessary dependence on large-scale harmful predictions—those and only those. Avoid the big subjects that may hurt your future: be fooled in small matters, not in the large. Do not listen to economic forecasters or to predictors in social science (they are mere entertainers), but do make your own forecast for the picnic. By all means, demand certainty for the next picnic; but avoid government social-security forecasts for the year 2040.

  Know how to rank beliefs not according to their plausibility but by the harm they may cause.

  Be Prepared

  The reader might feel queasy reading about these general failures to see the future and wonder what to do. But if you shed the idea of full predictability, there are plenty of things to do provided you remain conscious of their limits. Knowing that you cannot predict does not mean that you cannot benefit from unpredictability.

  The bottom line: be prepared! Narrow-minded prediction has an analgesic or therapeutic effect. Be aware of the numbing effect of magic numbers. Be prepared for all relevant eventualities.

  THE IDEA OF POSITIVE ACCIDENT

  Recall the empirics, those members of the Greek school of empirical medicine. They considered that you should be open-minded in your medical diagnoses to let luck play a role. By luck, a patient might be cured, say, by eating some food that accidentally turns out to be the cure for his disease, so that the treatment can then be used on subsequent patients. The positive accident (like hypertension medicine producing side benefits that led to Viagra) was the empirics’ central method of medical discovery.

  This same point can be generalized to life: maximize the serendipity around you.

  Sextus Empiricus retold the story of Apelles the Painter, who, while doing a portrait of a horse, was attempting to depict the foam from the horse’s mouth. After trying very hard and making a mess, he gave up and, in irritation, took the sponge he used for cleaning his brush and threw it at the picture. Where the sponge hit, it left a perfect representation of the foam.

  Trial and error means trying a lot. In The Blind Watchmaker, Richard Dawkins brilliantly illustrates this notion of the world without grand design, moving by small incremental random changes. Note a slight disagreement on my part that does not change the story by much: the world, rather, moves by large incremental random changes.

  Indeed, we have psychological and intellectual difficulties with trial and error, and with accepting that series of small failures are necessary in life. My colleague Mark Spitznagel understood that we humans have a mental hang-up about failures: “You need to love to lose” was his motto. In fact, the reason I felt immediately at home in America is precisely because American culture encourages the process of failure, unlike the cultures of Europe and Asia where failure is met with stigma and embarrassment. America’s specialty is to take these small risks for the rest of the world, which explains this country’s disproportionate share in innovations. Once established, an idea or a product is later “perfected” over there.

  Volatility and Risk of Black Swan

  People are often ashamed of losses, so they engage in strategies that produce very little volatility but contain the risk of a large loss—like collecting nickels in front of steamrollers. In Japanese culture, which is ill-adapted to randomness and badly equipped to understand that bad performance can come from bad luck, losses can severely tarnish someone’s reputation. People hate volatility, thus engage in strategies exposed to blowups, leading to occasional suicides after a big loss.

  Furthermore, this trade-off between volatility and risk can show up in careers that give the appearance of being stable, like jobs at IBM until the 1990s. When laid off, the employee faces a total void: he is no longer fit for anything else. The same holds for those in protected industries. On the other hand, consultants can have volatile earnings as their clients’ earnings go up and down, but face a lower risk of starvation, since their skills match demand—fluctuat nec mergitur (fluctuates but doesn’t sink). Likewise, dictatorships that do not appear volatile, like, say, Syria or Saudi Arabia, face a larger risk of chaos than, say, Italy, as the latter has been in a state of continual political turmoil since the Second World War. I learned about this problem from the finance industry, in which we see “conservative” bankers sitting on a pile of dynamite but fooling themselves because their operations seem dull and lacking in volatility.

  Barbell Strategy

  I am trying here to generalize to real life the notion of the “barbell” strategy I used as a trader, which is as follows. If you know that you are vulnerable to prediction errors, and if you accept that most “risk measures” are flawed, because of the Black Swan, then your strategy is to be as hyperconservative and hyperaggressive as you can be instead of being mildly aggressive or conservative. Instead of putting your money in “medium risk” investments (how do you know it is medium risk? by listening to tenure-seeking “experts”?), you need to put a portion, say 85 to 90 percent, in extremely safe instruments, like Treasury bills—as safe a class of instruments as you can manage to find on this planet. The remaining 10 to 15 percent you put in extremely speculative bets, as leveraged as possible (like options), preferably venture capital–style portfolios.* That way you do not depend on errors of risk management; no Black Swan can hurt you at all, beyond your “floor,” the nest egg that you have in maximally safe investments. Or, equivalently, you can have a speculative portfolio and insure it (if possible) against losses of more than, say, 15 percent. You are “clipping” your incomputable risk, the one that is harmful to you. Instead of having medium risk, you have high risk on one side and no risk on the other. The average will be medium risk but constitutes a positive exposure to the Black Swan. More technically, this can be called a “convex” combination. Let us see how this can be implemented in all aspects of life.
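  The arithmetic of the barbell can be made concrete. The sketch below is my own illustration, not the author's; the function name, the 90/10 split, and the return figures are hypothetical, chosen only to show why the floor is immune to the speculative side blowing up while the upside stays open-ended.

```python
# Illustrative sketch of the "barbell" allocation described above.
# All names and numbers here are hypothetical, not from the book:
# ~90% in maximally safe instruments, ~10% in speculative bets whose
# worst case is losing the stake (a return of -1.0) but whose upside
# is open-ended -- the positive Black Swan.

def barbell_outcome(wealth, safe_fraction, safe_return, speculative_return):
    """Portfolio value after one period under the barbell split.

    speculative_return is -1.0 (total loss) at worst, but can be
    arbitrarily large; the safe side is assumed never to lose.
    """
    safe = wealth * safe_fraction * (1 + safe_return)
    risky = wealth * (1 - safe_fraction) * (1 + speculative_return)
    return safe + risky

w = 100_000.0
# Worst case: the speculative side is wiped out entirely.
# The "floor" is 90,000 * 1.02 = 91,800 -- no Black Swan can take you below it.
floor = barbell_outcome(w, 0.90, 0.02, -1.0)
# Upside is open-ended; e.g. a 20x payoff on the speculative side
# gives 91,800 + 10,000 * 21 = 301,800.
upside = barbell_outcome(w, 0.90, 0.02, 20.0)
print(floor, upside)
```

The point of the structure is visible in the two calls: the first argument set bounds the damage by construction, while the second shows that nothing bounds the gain.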

  “Nobody Knows Anything”

  The legendary screenwriter William Goldman was said to have shouted “Nobody knows anything!” in relation to the prediction of movie sales. Now, the reader may wonder how someone as successful as Goldman can figure out what to do without making predictions. The answer stands perceived business logic on its head. He knew that he could not predict individual events, but he was well aware that the unpredictable, namely a movie turning into a blockbuster, would benefit him immensely.

  So the second lesson is more aggressive: you can actually take advantage of the problem of prediction and epistemic arrogance! As a matter of fact, I suspect that the most successful businesses are precisely those that know how to work around inherent unpredictability and even exploit it.

  Recall my discussion of the biotech company whose managers understood that the essence of research is in the unknown unknowns. Also, notice how they seized on the “corners,” those free lottery tickets in the world.

  Here are the (modest) tricks. But note that the more modest they are, the more effective they will be.

  First, make a distinction between positive contingencies and negative ones. Learn to distinguish between those human undertakings in which the lack of predictability can be (or has been) extremely beneficial and those where the failure to understand the future caused harm. There are both positive and negative Black Swans. William Goldman was involved in the movies, a positive–Black Swan business. Uncertainty did occasionally pay off there.

  A negative–Black Swan business is one where the unexpected can hit hard and hurt severely. If you are in the military, in catastrophe insurance, or in homeland security, you face only downside. Likewise, as we saw in Chapter 7, if you are in banking and lending, surprise outcomes are likely to be negative for you. You lend, and in the best of circumstances you get your loan back—but you may lose all of your money if the borrower defaults. In the event that the borrower enjoys great financial success, he is not likely to offer you an additional dividend.

  Aside from the movies, examples of positive–Black Swan businesses are: some segments of publishing, scientific research, and venture capital. In these businesses, you lose small to make big. You have little to lose per book and, for completely unexpected reasons, any given book might take off. The downside is small and easily controlled. The problem with publishers, of course, is that they regularly pay up for books, thus making their upside rather limited and their downside monstrous. (If you pay $10 million for a book, your Black Swan is its not being a bestseller.) Likewise, while technology can carry a great payoff, paying for the hyped-up story, as people did with the dot-com bubble, can make any upside limited and any downside huge. It is the venture capitalist who invested in a speculative company and sold his stake to unimaginative investors who is the beneficiary of the Black Swan, not the “me, too” investors.

  In these businesses you are lucky if you don’t know anything—particularly if others don’t know anything either, but aren’t aware of it. And you fare best if you know where your ignorance lies, if you are the only one looking at the unread books, so to speak. This dovetails into the “barbell” strategy of taking maximum exposure to the positive Black Swans while remaining paranoid about the negative ones. For your exposure to the positive Black Swan, you do not need to have any precise understanding of the structure of uncertainty. I find it hard to explain that when you have a very limited loss you need to get as aggressive, as speculative, and sometimes as “unreasonable” as you can be.

  Middlebrow thinkers sometimes make the analogy of such strategy with that of collecting “lottery tickets.” It is plain wrong. First, lottery tickets do not have a scalable payoff; there is a known upper limit to what they can deliver. The ludic fallacy applies here—the scalability of real-life payoffs compared to lottery ones makes the payoff unlimited or of unknown limit. Secondly, the lottery tickets have known rules and laboratory-style well-presented possibilities; here we do not know the rules and can benefit from this additional uncertainty, since it cannot hurt you and can only benefit you.*

  Don’t look for the precise and the local. Simply, do not be narrow-minded. The great discoverer Pasteur, who came up with the notion that chance favors the prepared, understood that you do not look for something particular every morning but work hard to let contingency enter your working life. As Yogi Berra, another great thinker, said, “You got to be very careful if you don’t know where you’re going, because you might not get there.”

  Likewise, do not try to predict precise Black Swans—it tends to make you more vulnerable to the ones you did not predict. My friends Andy Marshall and Andrew Mays at the Department of Defense face the same problem. The impulse on the part of the military is to devote resources to predicting the next problems. These thinkers advocate the opposite: invest in preparedness, not in prediction.

  Remember that infinite vigilance is just not possible.

  Seize any opportunity, or anything that looks like opportunity. They are rare, much rarer than you think. Remember that positive Black Swans have a necessary first step: you need to be exposed to them. Many people do not realize that they are getting a lucky break in life when they get it. If a big publisher (or a big art dealer or a movie executive or a hotshot banker or a big thinker) suggests an appointment, cancel anything you have planned: you may never see such a window open up again. I am sometimes shocked at how little people realize that these opportunities do not grow on trees. Collect as many free nonlottery tickets (those with open-ended payoffs) as you can, and, once they start paying off, do not discard them. Work hard, not in grunt work, but in chasing such opportunities and maximizing exposure to them. This makes living in big cities invaluable because you increase the odds of serendipitous encounters—you gain exposure to the envelope of serendipity. The idea of settling in a rural area on grounds that one has good communications “in the age of the Internet” tunnels out of such sources of positive uncertainty. Diplomats understand that very well: casual chance discussions at cocktail parties usually lead to big breakthroughs—not dry correspondence or telephone conversations. Go to parties! If you’re a scientist, you will chance upon a remark that might spark new research. And if you are autistic, send your associates to these events.

  Beware of precise plans by governments. As discussed in Chapter 10, let governments predict (it makes officials feel better about themselves and justifies their existence) but do not set much store by what they say. Remember that the interest of these civil servants is to survive and self-perpetuate—not to get to the truth. It does not mean that governments are useless, only that you need to keep a vigilant eye on their side effects. For instance, regulators in the banking business are prone to a severe expert problem and they tend to condone reckless but (hidden) risk taking. Andy Marshall and Andy Mays asked me if the private sector could do better in predicting. Alas, no. Once again, recall the story of banks hiding explosive risks in their portfolios. It is not a good idea to trust corporations with matters such as rare events because the performance of these executives is not observable on a short-term basis, and they will game the system by showing good performance so they can get their yearly bonus. The Achilles’ heel of capitalism is that if you make corporations compete, it is sometimes the one that is most exposed to the negative Black Swan that will appear to be the most fit for survival. Also recall from the footnote on Ferguson’s discovery in Chapter 1 that markets are not good predictors of wars. No one in particular is a good predictor of anything. Sorry.

  “There are some people who, if they don’t already know, you can’t tell ’em,” as the great philosopher of uncertainty Yogi Berra once said. Do not waste your time trying to fight forecasters, stock analysts, economists, and social scientists, except to play pranks on them. They are considerably easy to make fun of, and many get angry quite readily. It is ineffective to moan about unpredictability: people will continue to predict foolishly, especially if they are paid for it, and you cannot put an end to institutionalized frauds. If you ever do have to heed a forecast, keep in mind that its accuracy degrades rapidly as you extend it through time.

  If you hear a “prominent” economist using the word equilibrium, or normal distribution, do not argue with him; just ignore him, or try to put a rat down his shirt.

  The Great Asymmetry

  All these recommendations have one point in common: asymmetry. Put yourself in situations where favorable consequences are much larger than unfavorable ones.

  Indeed, the notion of asymmetric outcomes is the central idea of this book: I will never get to know the unknown since, by definition, it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that.

  This idea is often erroneously called Pascal’s wager, after the philosopher and (thinking) mathematician Blaise Pascal. He presented it something like this: I do not know whether God exists, but I know that I have nothing to gain from being an atheist if he does not exist, whereas I have plenty to lose if he does. Hence, this justifies my belief in God.

  Pascal’s argument is severely flawed theologically: one has to be naïve enough to believe that God would not penalize us for false belief. Unless, of course, one is taking the quite restrictive view of a naïve God. (Bertrand Russell was reported to have claimed that God would need to have created fools for Pascal’s argument to work.)

  But the idea behind Pascal’s wager has fundamental applications outside of theology. It stands the entire notion of knowledge on its head. It eliminates the need for us to understand the probabilities of a rare event (there are fundamental limits to our knowledge of these); rather, we can focus on the payoff and benefits of an event if it takes place. The probabilities of very rare events are not computable; the effect of an event on us is considerably easier to ascertain (the rarer the event, the fuzzier the odds). We can have a clear idea of the consequences of an event, even if we do not know how likely it is to occur. I don’t know the odds of an earthquake, but I can imagine how San Francisco might be affected by one. This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty. Much of my life is based on it.
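  The decision rule in this paragraph can be sketched mechanically. The example below is my own illustration (the function, the action names, and the payoff figures are all hypothetical): when the probability of the rare event is unknowable, rank actions by their worst-case consequence rather than by any expected value.

```python
# Illustrative sketch (my own, not the author's) of deciding by
# consequences rather than probabilities: the rare event's odds are
# treated as unknowable, so actions are ranked by their worst outcome.

def choose_by_consequences(actions):
    """Pick the action whose worst-case payoff is least bad.

    `actions` maps a name to (payoff_if_event, payoff_if_no_event);
    since the event's probability is not computable, we compare only
    the minimum of each pair (a maximin rule).
    """
    return max(actions, key=lambda name: min(actions[name]))

actions = {
    # hypothetical payoffs, in arbitrary units
    "uninsured": (-1_000_000, 0),    # ruin if the earthquake hits
    "insured":   (-5_000, -5_000),   # small known premium either way
}
print(choose_by_consequences(actions))  # "insured": -5,000 at worst beats -1,000,000
```

Note what never appears in the code: a probability. Only the consequences of each action enter the comparison, which is exactly the inversion the paragraph describes.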

  You can build an overall theory of decision making on this idea. All you have to do is mitigate the consequences. As I said, if my portfolio is exposed to a market crash, the odds of which I can’t compute, all I have to do is buy insurance, or get out and invest the amounts I am not willing to ever lose in less risky securities.
