
Antifragile: Things That Gain from Disorder


by Nassim Nicholas Taleb


  I have often followed what I call Bergson’s razor: “A philosopher should be known for one single idea, not more” (I can’t source it to Bergson, but the rule is good enough). The French essayist and poet Paul Valéry once asked Einstein if he carried a notebook to write down ideas. “I never have ideas” was the reply (in fact he just did not have chickens***t ideas). So, a heuristic: if someone has a long bio, I skip him—at a conference a friend invited me to have lunch with an overachieving hotshot whose résumé “can cover more than two or three lives”; I skipped to sit at a table with the trainees and stage engineers.2 Likewise when I am told that someone has three hundred academic papers and twenty-two honorary doctorates, but no other single compelling contribution or main idea behind it, I avoid him like the bubonic plague.

  1 Recall that the overediting interventionist missed the main mistake in Chapter 7. The 663-page document Financial Crisis Inquiry Report by the Financial Crisis Inquiry Commission missed what I believe are the main reasons: fragility and absence of skin in the game. But of course they listed every possible epiphenomenon you can think of as cause.

  2 Even the Nobel, with all its ills of inducing competition in something as holy as science, is not granted for a collection of papers but rarely for more than a single, but major, contribution.

  CHAPTER 20

  Time and Fragility

  Prophecy, like knowledge, is subtractive, not additive—The Lindy effect, or how the old prevails over the new, especially in technology, no matter what they say in California—Prophecy not a recommended and voluntary career

  Antifragility implies—contrary to initial instinct—that the old is superior to the new, and much more than you think. No matter how something looks to your intellectual machinery, or how well or poorly it narrates, time will know more about its fragilities and break it when necessary. Here, I expose a contemporary disease—linked to interventionism—called neomania, which brings fragility but I believe may be treatable if one is patient enough.

  What survives must be good at serving some (mostly hidden) purpose that time can see but our eyes and logical faculties can’t capture. In this chapter we use the notion of fragility as a central driver of prediction.

  Recall the foundational asymmetry: the antifragile benefits from volatility and disorder, the fragile is harmed. Well, time is the same as disorder.

  FROM SIMONIDES TO JENSEN

  As an exercise in the use of the distinction between fragility and antifragility, let us play prophet, with the understanding that it is not a good career choice unless you have a thick skin, a good circle of friends, little access to the Internet, a library with a good set of ancient proverbs, and, if possible, the ability to derive personal benefits from your prophecy. As shown from the track record of the prophets: before you are proven right, you will be reviled; after you are proven right, you will be hated for a while, or, what’s worse, your ideas will appear to be “trivial” thanks to retrospective distortion. This makes it far more convincing to follow the Fat Tony method of focusing on shekels more than recognition. And such treatment has continued in modern times: twentieth-century intellectuals who have embraced the wrong ideas, such as Communism or even Stalinism, have remained fashionable—and their books remain on the bookstore shelves—while those who, like the political philosopher Raymond Aron, saw the problems got short shrift both before and after being acknowledged as having seen things right.

  Now close your eyes and try to imagine your future surroundings in, say, five, ten, or twenty-five years. Odds are your imagination will produce new things in it, things we call innovation, improvements, killer technologies, and other inelegant and hackneyed words from the business jargon. These common concepts concerning innovation, we will see, are not just offensive aesthetically, but they are nonsense both empirically and philosophically.

  Why? Odds are that your imagination will be adding things to the present world. I am sorry, but I will show in this chapter that this approach is exactly backward: the way to do it rigorously, according to the notions of fragility and antifragility, is to take away from the future, reduce from it, simply, things that do not belong to the coming times. Via negativa. What is fragile will eventually break; and, luckily, we can easily tell what is fragile. Positive Black Swans are more unpredictable than negative ones.

  “Time has sharp teeth that destroy everything,” declaimed the sixth-century (B.C.) poet Simonides of Ceos, perhaps starting a tradition in Western literature about the inexorable effect of time. I can trace a plethora of elegant classical expressions, from Ovid (tempus edax rerum—time devours everything) to the no less poetic twentieth-century Franco-Russian poetess Elsa Triolet (“time burns but leaves no ashes”). Naturally, this exercise triggered some poetic waxing, so I am now humming a French poem put to music titled “Avec le temps” about how time erases things, even bad memories (though it doesn’t say that it erases us as well in the process). Now, thanks to convexity effects, we can put a little bit of science in these, and produce our own taxonomy of what should be devoured the fastest by that inexorable time. The fragile will eventually break—and, luckily, we are capable of figuring out what is fragile. Even what we believe is antifragile will eventually break, but it should take much, much longer to do so (wine does well with time, but up to a point; and not if you put it in the crater of a volcano).

  The verse by Simonides that started the previous paragraph continues with the stipulation “even the most solid.” So Simonides had the adumbration of the idea, quite useful, that the most solid will be swallowed with more difficulty, hence last. Naturally, he did not think that something could be antifragile, hence never swallowed.

  Now, I insist on the via negativa method of prophecy as being the only valid one: there is no other way to produce a forecast without being a turkey somewhere, particularly in the complex environment in which we live today. Now, I am not saying that new technologies will not emerge—something new will rule its day, for a while. What is currently fragile will be replaced by something else, of course. But this “something else” is unpredictable. In all likelihood, the technologies you have in your mind are not the ones that will make it, no matter your perception of their fitness and applicability—with all due respect to your imagination.

  Recall that the most fragile is the predictive, what is built on the basis of predictability—in other words, those who underestimate Black Swans will eventually exit the population.

  An interesting apparent paradox is that, according to these principles, longer-term predictions are more reliable than short-term ones, given that one can be quite certain that what is Black Swan–prone will be eventually swallowed by history since time augments the probability of such an event. On the other hand, typical predictions (not involving the currently fragile) degrade with time; in the presence of nonlinearities, the longer the forecast the worse its accuracy. Your error rate for a ten-year forecast of, say, the sales of a computer plant or the profits of a commodity vendor can be a thousand times that of a one-year projection.

  LEARNING TO SUBTRACT

  Consider the futuristic projections made throughout the past century and a half, as expressed in literary novels such as those by Jules Verne, H. G. Wells, or George Orwell, or in now forgotten narratives of the future produced by scientists or futurists. It is remarkable that the tools that seem to currently dominate the world, such as the Internet, or more mundane matters such as the wheel on the suitcase of Book IV, were completely missing from these forecasts. But it is not here that the major error lies. The problem is that almost everything that was imagined never took place, except for a few overexploited anecdotes (such as the steam engine by Hero of Alexandria or the assault vehicle by Leonardo da Vinci). Our world looks too close to theirs, much closer to theirs than they ever imagined or wanted to imagine. And we tend to be blind to that fact—there seems to be no correcting mechanism that can make us aware of the point as we go along forecasting a highly technocratic future.

  There may be a selection bias: those people who engage in producing these accounts of the future will tend to have (incurable and untreatable) neomania, the love of the modern for its own sake.

  Tonight I will be meeting friends in a restaurant (tavernas have existed for at least twenty-five centuries). I will be walking there wearing shoes hardly different from those worn fifty-three hundred years ago by the mummified man discovered in a glacier in the Austrian Alps. At the restaurant, I will be using silverware, a Mesopotamian technology, which qualifies as a “killer application” given what it allows me to do to the leg of lamb, such as tear it apart while sparing my fingers from burns. I will be drinking wine, a liquid that has been in use for at least six millennia. The wine will be poured into glasses, an innovation claimed by my Lebanese compatriots to come from their Phoenician ancestors, and if you disagree about the source, we can say that glass objects have been sold by them as trinkets for at least twenty-nine hundred years. After the main course, I will have a somewhat younger technology, artisanal cheese, paying higher prices for those that have not changed in their preparation for several centuries.

  Had someone in 1950 predicted such a minor gathering, he would have imagined something quite different. So, thank God, I will not be dressed in a shiny synthetic space-style suit, consuming nutritionally optimized pills while communicating with my dinner peers by means of screens. The dinner partners, in turn, will be expelling airborne germs on my face, as they will not be located in remote human colonies across the galaxy. The food will be prepared using a very archaic technology (fire), with the aid of kitchen tools and implements that have not changed since the Romans (except in the quality of some of the metals used). I will be sitting on an (at least) three-thousand-year-old device commonly known as the chair (which will be, if anything, less ornate than its majestic Egyptian ancestor). And I will not be repairing to the restaurant with the aid of a flying motorcycle. I will be walking or, if late, using a cab from a century-old technology, driven by an immigrant—immigrants were driving cabs in Paris a century ago (Russian aristocrats), same as in Berlin and Stockholm (Iraqis and Kurdish refugees), Washington, D.C. (Ethiopian postdoc students), Los Angeles (musically oriented Armenians), and New York (multinationals) today.

  David Edgerton showed that in the early 2000s we produce two and a half times as many bicycles as we do cars and invest most of our technological resources in maintaining existing equipment or refining old technologies (note that this is not just a Chinese phenomenon: Western cities are aggressively trying to become bicycle-friendly). Also consider that one of the most consequential technologies seems to be the one people talk about the least: the condom. Ironically, it wants to look like less of a technology; it has been undergoing meaningful improvements, with the precise aim of being less and less noticeable.

  FIGURE 17. Cooking utensils from Pompeii, hardly different from those found in today’s (good) kitchens

  So, the prime error is as follows. When asked to imagine the future, we have the tendency to take the present as a baseline, then produce a speculative destiny by adding new technologies and products to it and what sort of makes sense, given an interpolation of past developments. We also represent society according to our utopia of the moment, largely driven by our wishes—except for a few people called doomsayers, the future will be largely inhabited by our desires. So we will tend to over-technologize it and underestimate the might of the equivalent of these small wheels on suitcases that will be staring at us for the next millennia.

  A word on the blindness to this over-technologizing. After I left finance, I started attending some of the fashionable conferences attended by pre-rich and post-rich technology people and the new category of technology intellectuals. I was initially exhilarated to see them wearing no ties, as, living among tie-wearing abhorrent bankers, I had developed the illusion that anyone who doesn’t wear a tie was not an empty suit. But these conferences, while colorful and slick with computerized images and fancy animations, felt depressing. I knew I did not belong. It was not just their additive approach to the future (failure to subtract the fragile rather than add to destiny). It was not entirely their blindness brought about by uncompromising neomania. It took a while for me to realize the reason: a profound lack of elegance. Technothinkers tend to have an “engineering mind”—to put it less politely, they have autistic tendencies. While they don’t usually wear ties, these types tend, of course, to exhibit all the textbook characteristics of nerdiness—mostly lack of charm, interest in objects instead of persons, causing them to neglect their looks. They love precision at the expense of applicability. And they typically share an absence of literary culture.

  This absence of literary culture is actually a marker of future blindness because it is usually accompanied by a denigration of history, a byproduct of unconditional neomania. Outside of the niche and isolated genre of science fiction, literature is about the past. We do not learn physics or biology from medieval textbooks, but we still read Homer, Plato, or the very modern Shakespeare. We cannot talk about sculpture without knowledge of the works of Phidias, Michelangelo, or the great Canova. These are in the past, not in the future. Just by setting foot into a museum, the aesthetically minded person is connecting with the elders. Whether overtly or not, he will tend to acquire and respect historical knowledge, even if it is to reject it. And the past—properly handled, as we will see in the next section—is a much better teacher about the properties of the future than the present. To understand the future, you do not need technoautistic jargon, obsession with “killer apps,” these sort of things. You just need the following: some respect for the past, some curiosity about the historical record, a hunger for the wisdom of the elders, and a grasp of the notion of “heuristics,” these often unwritten rules of thumb that are so determining of survival. In other words, you will be forced to give weight to things that have been around, things that have survived.

  Technology at Its Best

  But technology can cancel the effect of bad technologies, by self-subtraction.

  Technology is at its best when it is invisible. I am convinced that technology is of greatest benefit when it displaces the deleterious, unnatural, alienating, and, most of all, inherently fragile preceding technology. Many of the modern applications that have managed to survive today came to disrupt the deleterious effect of the philistinism of modernity, particularly the twentieth century: the large multinational bureaucratic corporation with “empty suits” at the top; the isolated family (nuclear) in a one-way relationship with the television set, even more isolated thanks to car-designed suburban society; the dominance of the state, particularly the militaristic nation-state, with border controls; the destructive dictatorship on thought and culture by the established media; the tight control on publication and dissemination of economic ideas by the charlatanic economics establishment; large corporations that tend to control their markets now threatened by the Internet; pseudorigor that has been busted by the Web; and many others. You no longer have to “press 1 for English” or wait in line for a rude operator to make bookings for your honeymoon in Cyprus. In many respects, as unnatural as it is, the Internet removed some of the even more unnatural elements around us. For instance, the absence of paperwork makes bureaucracy—something modernistic—more palatable than it was in the days of paper files. With a little bit of luck a computer virus will wipe out all records and free people from their past mistakes.

  Even now, we are using technology to reverse technology. Recall my walk to the restaurant wearing shoes not too dissimilar to those worn by the ancient, preclassical person found in the Alps. The shoe industry, after spending decades “engineering” the perfect walking and running shoe, with all manner of “support” mechanisms and material for cushioning, is now selling us shoes that replicate being barefoot—they want to be so unobtrusive that their only claimed function is to protect our feet from the elements, not to dictate how we walk as the more modernistic mission was. In a way they are selling us the calloused feet of a hunter-gatherer that we can put on, use, and then remove upon returning to civilization. It is quite exhilarating to wear these shoes when walking in nature as one wakes up to a new dimension while feeling the three dimensions of the terrain. Regular shoes feel like casts that separate us from the environment. And they don’t have to be inelegant: the technology is in the sole, not the shoe, as the new soles can be both robust and very thin, thus allowing the foot to hug the ground as if one were barefoot—my best discovery is an Italian-looking moccasin made in Brazil that allows me to both run on stones and go to dinner in restaurants.

  Then again, perhaps they should just sell us reinforced waterproof socks (in effect, what the Alpine fellow had), but it would not be very profitable for these firms.1

  And the great use of the tablet computer (notably the iPad) is that it allows us to return to Babylonian and Phoenician roots of writing and take notes on a tablet (which is how it started). One can now jot down handwritten, or rather fingerwritten, notes—it is much more soothing to write longhand, instead of having to go through the agency of a keyboard. My dream would be to someday write everything longhand, as almost every writer did before modernity.

  So it may be a natural property of technology to only want to be displaced by itself.

  Next let me show how the future is mostly in the past.

  TO AGE IN REVERSE: THE LINDY EFFECT

  Time to get more technical, so a distinction is helpful at this stage. Let us separate the perishable (humans, single items) from the nonperishable, the potentially perennial. The nonperishable is anything that does not have an organic unavoidable expiration date. The perishable is typically an object, the nonperishable has an informational nature to it. A single car is perishable, but the automobile as a technology has survived about a century (and we will speculate should survive another one). Humans die, but their genes—a code—do not necessarily. The physical book is perishable—say, a specific copy of the Old Testament—but its contents are not, as they can be expressed into another physical book.

 
