The Knockoff Economy


by Kal Raustiala


  Yet there is good reason to be skeptical that copying is much quicker now than it was in the past. Indeed, for a very long time the copying of fashion designs has been easy and fast. Well before Al Gore invented the Internet, ordinary photos combined with a fax machine allowed copyists to begin work within hours of photographing or sketching the original. Even before that, transcontinental air travel allowed designs to be copied in a matter of days, and for designs produced domestically, less than a day. Time magazine wrote in 1936 that “by the early Depression years [copying] had gone so far that no exclusive model was sure to remain exclusive 24 hours; a dress exhibited in the morning at $60 would be duplicated at $25 before sunset and at lower prices later in the week.”72

  So very fast copying is actually many decades old. And people have been predicting doom from fast copying for just as long. Back in 1940, a Harvard Business Review article about design piracy noted that fashion producers had, “within the past 50 years,” been complaining “that modern, high-speed methods of communication” have made copying much quicker and therefore more harmful to originators.73 If the industry moved that fast during the 1930s and 1940s, the speed of copying can hardly have increased meaningfully in the decades since. In fact, even the claim that copyists are beating originators to market, which sounds like an artifact of our whiz-bang Internet-meets-Globalization 21st-century world, is old news. Close observers of the industry were sounding the alarm about this alleged peril when Herbert Hoover was just starting his run for the presidency.74

  In short, claims about the impact of fast copying sound grave and new but have been around a very long time. Fast copying is old hat. And in the intervening (many) decades, the American fashion industry has grown enormously successful. The bottom line is that there doesn’t seem to have ever been a golden age of first-mover advantage, in which designers could reap the benefits of originality before copyists swooped in. Given that, it is hard to believe that first-mover advantage explains why copying doesn’t kill creativity in fashion.

  Nor, by the way, is there much evidence that fast copying causes serious harm to the fashion industry overall. If fast copying is a really serious problem, it is hard to explain the industry’s commitment to its runway schedule. A lot of new fashion designs debut in the major spring and fall shows in New York, Paris, Milan, and London. The “spring” shows are actually held the preceding fall. And the “fall” shows the preceding spring. If first-mover advantages were crucial, and rapid copying deadly, we would not expect to see such a significant lag between the runway and retail. The industry would rely more on secrecy and speed to protect first-mover advantage. That the major players do not suggests that first-mover advantage is not the mainspring of innovation incentives for the industry as a whole.

  Still, first-mover advantage probably does play some role in the creative success of the apparel industry. But it is more important to consumers than to producers. What do we mean?

  It may be easy to copy a design quickly, but that doesn’t mean the public is going to buy quickly. There is a lag as consumers figure out what they like and what is trendy that season. This lag between debut and diffusion allows early adopters to differentiate themselves from the crowd. Fashionistas adopt a new design, and then, in a few cases, it begins to spread. The freedom to copy a design that is becoming hot helps to create a trend and, ultimately, expand the market for that design. For this process to unfold, however, the design must become hot in the first place. And as a practical matter, styles become hot over time, not instantly (and of course most remain very cold). In short, some lag between creation and widespread copying almost always exists, and this lag is an essential element of the piracy paradox. But that is not the same as saying that fashion designers stay creative in the face of copying because they have a meaningful first-mover advantage.

  In sum, the trick to being a successful fashion copyist isn’t just copying. It’s copying winners. And that almost always requires waiting.

  CONCLUSION: SHOULD DESIGNERS KNOCK OFF THEIR OWN DESIGNS?

  The paradoxical effects of piracy in fashion have an interesting implication. If free and legal copying propels the fashion cycle forward ever faster, leading to more rapid turnover in styles and more sales, why do individual designers leave the copying to others? In other words, shouldn’t designers knock off their own designs? The more designs diffuse, after all, the quicker they die—and hence the quicker new designs can debut.

  Indeed, some have provocatively suggested that smart firms ought to give away cheaper, visibly inferior versions of their products.75 We are not aware of anyone who gives away bad versions of their clothes. And there is a good reason. Brand protection—the desire by trademark owners to maintain the exclusivity of their very valuable high-end marks—stops this from occurring in the real world. But we do see a careful version of self-knockoffs: what insiders call “diffusion” or bridge lines.

  Diffusion lines are clothes by a famous designer that sell at lower price points under a distinctive but related label. A good example is Marc by Marc Jacobs—clearly identifiable as something designed (supposedly) by Marc Jacobs, but not the same as the top-end Marc Jacobs. To be sure, using a well-respected trademark at multiple price points runs the risk of diluting the value of that mark. While some fashion insiders stress the danger of these lines blurring a brand’s identity and tarnishing a mark—and cite the story of Halston, whose fall from grace and fortune was dramatic after he tried marketing clothes under his name to the masses at JC Penney—many well-known design houses have a second or even third line that is lower priced. One way to understand the phenomenon is precisely as a strategy to knock off one’s own signature designs, so that consumers who can’t afford the real thing can at least get a piece of it—and that desirable label.

  A very prominent user of this strategy is Giorgio Armani, which has up to five distinct lines, depending on how one counts. Most fashion firms, however, do not go this deeply into the diffusion world. Why the Armani approach is not more common is an interesting question. But it is clear that at least some degree of self-copying occurs throughout the industry.

  We suspect the reason true self-knockoffs—in which the same basic design is offered at different prices—are rare revolves around the great power of trademarks. Tarnishing a brand is perilous, as Halston and Pierre Cardin taught the fashion world. Moreover, different labels even within the same house—Armani Exchange, Armani white label, and so forth—have different identities, and it is dangerous to blur them. Customers at the high end will not appreciate it; it is one thing to see a favored designer’s work knocked off by Forever 21, and another for the designer to do it herself. Given this, fashion firms are wisely cautious. It is usually better to have someone else do the copying.

  Let us return to where we began. Fashion is a huge industry in which imitation is everywhere—and completely legal. The fashion world ought to be in an economic freefall, since our conventional view of innovation tells us that widespread copying destroys creativity and kills markets. Yet the apparel industry is not just surviving—it is thriving. Extensive and legal copying accelerates the fashion cycle, banishing once-desired designs to the dustbin of apparel history (perhaps later to be dusted off and reintroduced) and sending the fashion-conscious off in search of the new, new thing. And copying allows trends, the cornerstone of contemporary fashion, to develop and spread. The result is an American fashion industry that is dynamic, innovative, successful, and full of copying.

  2

  CUISINE, COPYING, AND CREATIVITY

  In the spring of 2007 a chef named Ed McFarland opened a restaurant on Lafayette Street in downtown Manhattan called Ed’s Lobster Bar. For years McFarland had been the sous-chef at the very successful Pearl Oyster Bar on Cornelia Street in Greenwich Village. Pearl Oyster Bar was a small place, but it was well known and always packed. The chef and owner, Rebecca Charles, had built an avid following based on a simple formula: a short list of excellent seafood, elegant but spare New England coastal décor, a signature Caesar salad with English muffin croutons, and plenty of oyster crackers on the tables.

  Eventually Ed McFarland sought to strike out on his own, and when he did, he took with him a lot of ideas drawn from his years working at the Pearl Oyster Bar. At least, so claimed Rebecca Charles. Shortly after Ed’s Lobster Bar opened less than a mile from Pearl, an angry Charles filed suit in the federal court in lower Manhattan. In her suit she claimed that McFarland had “pirated Pearl’s entire menu; copied all aspects of Pearl’s presentation of its dishes; [and] duplicated Pearl’s readily identifiable décor.”1 According to Charles, Ed’s Lobster Bar was “a total plagiarism” of her well-known restaurant. Perhaps most galling to Charles was the Caesar salad. When she taught McFarland how to make her signature Caesar salad, she told him “you will never make this anywhere else.”2 Ed’s Lobster Bar menu nonetheless featured a Caesar salad somewhat tauntingly dubbed “Ed’s Caesar.”

  McFarland saw things differently. “I would say it’s a similar restaurant,” he told the New York Times. “I would not say it’s a copy.” McFarland pointed to some differences between the two establishments. Ed’s Lobster Bar, he asserted, was “more upscale… a lot neater, a lot cleaner, and a lot nicer looking.” Moreover, Ed’s had a skylight (Pearl had none) and a raw bar (though at the time, and still today, Pearl served oysters and clams on the half shell, as well as a shrimp cocktail). McFarland also noted that much of the décor in Pearl Oyster Bar was in fact common to seafood bars in New England, which was the ostensible homeland for the menu and the design of both Pearl Oyster Bar and Ed’s Lobster Bar.* Nonetheless, it was undeniable that Ed’s Lobster Bar looked a lot like Pearl Oyster Bar. It was casual but crisply designed, with a long and narrow room and a bar as the centerpiece. The menu looked quite similar too, though Ed’s, as befitting its name, featured lobster more prominently.

  The suit between Rebecca Charles and Ed McFarland was eventually settled out of court. But the issues it raised continue to vex the culinary community. What rights does a chef have to her creations? What makes a dish original? When does homage cross over to theft?

  These questions, and others like them, are unsettled, but the stakes are not small. Like the fashion industry, the restaurant industry is very large—sales at American dining establishments alone were estimated at $604 billion in 2010.3 Also like fashion, the world of cuisine features extensive imitation—call it borrowing, copying, or, if you prefer, piracy. And, in a situation similar again to fashion, for the most part American law grants chefs very limited rights over their creations. For all practical purposes recipes, no matter how original, cannot be copyrighted. So while a cookbook can be copyrighted as a whole, the individual recipes can be borrowed and republished by anyone—as a brief tour of the Internet, and popular cooking Web sites like Epicurious, will make clear.

  Perhaps more important, the “built food”—the edible dish itself—cannot be protected either. However good Rebecca Charles’s Caesar salad is, there is nothing in the law that stops the Ed McFarlands of the world from reproducing it. Anyone can taste a dish they like, apply their expertise to reverse-engineer it (by recognizing the taste and appearance of primary ingredients and reconstructing the steps taken to prepare them), and then recreate it elsewhere, including in a competitor restaurant. As any connoisseur of good food knows, this kind of copying happens all the time.

  The contemporary culinary scene is nonetheless astonishingly creative. Globalization has brought us an ever-expanding palette of new ingredients from around the world, and made them ever more affordable. And new cooking techniques, such as those pioneered by the “molecular gastronomy” or “modernist cuisine” movement, abound. It’s no wonder that new dishes are invented and refined every day. In many respects we are living in a golden age of cuisine, with more choices and more creativity than ever before.

  In short, cuisine presents much the same puzzle as fashion. How do chefs remain so creative while enjoying so little legal protection for their core product? Why doesn’t the mainstream view of copying—that it will squelch creativity—seem to apply in the kitchen?

  A VERY BRIEF CULINARY HISTORY

  For millennia, chefs throughout the world have labored to create delicious food. Yet for most of that history they labored in obscurity. In the West, only in the 19th century did a few great chefs, like Antoine Carême and Auguste Escoffier, achieve a public persona and some measure of fame. For many decades after these pioneers first entered the public eye, chefs were rarely treated as artists on par with their peers in the visual or literary arts, and for the most part restaurants, well into the 20th century, hardly noted their chefs. Today, of course, star chefs seem to be everywhere. Food lovers follow chefs from restaurant to restaurant, famous “consulting chefs” license the use of their names to kitchens that they may never have entered, and the Food Network has spawned an entire industry of chef contests and celebrities. The New York Times dining section documents the comings and goings of chefs as if they were baseball stars traded from team to team.

  While countries such as France, Italy, and China have ancient and storied food traditions, for much of its history the United States lacked a robust culinary culture. Local cuisines have long flourished in obvious places like New Orleans, as well as in less obvious spots, such as the low country of South Carolina, where traditional ingredients and dishes were passed down and cherished. But for the most part it is fair to say that compared to Europe, the United States was a culinary wasteland for a very long time. In his engaging history of contemporary American food culture, The United States of Arugula, David Kamp recounts the story of James Fenimore Cooper, the novelist, returning in 1833 from several years in France. Commenting on the differences between the French and American diets, Cooper called Americans “the grossest feeders of any civilized nation ever known,” and a people who subsisted on a “heavy, coarse, and indigestible” diet.4 While by the Gilded Age of the late 19th century it was clear that the rich in the major cities ate very well—think Diamond Jim Brady and his Brobdingnagian feasts of oysters, terrapin, and roast duck—there was no mass culture of food appreciation in the United States. Fine dining in restaurants existed in places like New York, but until the end of the Second World War most Americans seldom ate in restaurants.

  This began to change by the middle of the 20th century. The 1939 World’s Fair in Flushing Meadows, New York, was arguably the birthplace of contemporary fine French dining in the United States; it was the source of Henri Soulé’s Le Pavillon restaurant in Manhattan, which, until its closure in 1971, was the sun around which postwar haute cuisine revolved in New York—as well as the training ground of many top chefs.5 In the same era, the influential chef and food writer James Beard nurtured a growing appreciation of traditional American cooking and ingredients. The burgeoning American interest in fine cooking was exemplified by, and stoked by, Julia Child’s television show The French Chef, which debuted in 1963 and quickly became a cultural icon. The nation, increasingly rich and blessed with more leisure time, embraced cooking as a pastime and even a passion.

  By the 1970s a new wave of chefs, both in New York and in California (as well as in France), was redefining fine cuisine, emphasizing local ingredients, lighter treatments, and more casual service. Americans in due course discovered hitherto-exotic provisions such as goat cheese, baby greens, and sundried tomatoes. Lawyers Nina and Tim Zagat introduced their populist restaurant guide (populist in the sense that it reflected the views of many discerning customers rather than those of a single critic) in 1979, as Americans began eating out in greater and greater numbers. By the 1980s a full-blown culinary revolution was taking place, led by names that are now well known: Wolfgang Puck, David Bouley, Danny Meyer, Alice Waters. Slowly, chefs were becoming celebrities and restaurants sites of art appreciation on par—in the view of many—with the museum and the opera house. Even the US Department of Labor took note of these shifts, changing its classification of chefs from “domestics” to “professionals” in 1976.6

  The “chef revolution” of the late 20th century coincided with, and was driven by, changes in how great cooking was understood and evaluated. Creativity has always been a part of fine cooking, alongside skillful and precise preparations of time-honored classics. But increasingly, culinary reputations were being made by innovative and bold new dishes. A well-known example is Wolfgang Puck’s much-imitated smoked salmon pizza. Pizza had been around for a long time, but it took Puck, an Austrian working in tradition-flouting Los Angeles (and to a large degree—perhaps a very large degree—working with the assistance of his original Spago pizza-man, Ed LaDou) to create something truly new in the pizza world.* Puck’s success spawned a rash of imitators, and pizza has never been the same. Along the way Wolfgang Puck became a very rich and famous man.

  In short, from the 1960s onward, and especially during and after the 1980s, American food culture underwent a remarkable flowering. Across the board, a wealthier and more time-starved nation increasingly chose to eat out rather than cook in. To be sure, Americans overwhelmingly ate at fast-food restaurants, or at one of the thousands of simple Chinese take-out joints that dot the continent.7 (There are more Chinese restaurants in the United States than there are McDonald’s.) In parallel, however, the nation developed a more sophisticated restaurant scene, along with an increasingly food-knowledgeable populace that yearned to eat innovative, challenging cuisine.

  Today, it is not too much of an exaggeration to say we live in an unprecedentedly food-centered nation—not for everyone, to be sure, but for a large slice of affluent Americans. For these fortunate people, the search for creative and unusual meals is a way of life. Food is now art as well as sport.

 
