by Andrew Keen
Makers of the world unite!
We’ve heard this kind of nonsense before. We heard it in Anderson’s first two books, The Long Tail and Free, which got our abundant future entirely wrong. We heard it from Web 1.0 revolutionaries like Kevin Kelly, who promised us that the Internet would reinvent the traditional rules of business by giving away everything for free. We heard it in Dougherty’s Web 2.0 revolution, too, when we were promised that everybody could become an online writer or musician. But, of course, rather than democracy and diversity, all we’ve got from the digital revolution so far is fewer jobs, an overabundance of content, an infestation of piracy, a coterie of Internet monopolists, and a radical narrowing of our economic and cultural elite.
After Dougherty had shown me around the Maker Media office with its collection of cutting-edge 3-D printers spewing out their fully finished products, we sat down to talk. The jobs question was foremost on my mind. After all, if everyone in the future has a manufacturing facility on their desktop, then what happens to the tens of millions of people around the world already employed in industrial factories?
“So, Dale, what about employment?” I asked Dougherty, who, in contrast with many Silicon Valley entrepreneurs, is willing to acknowledge the darker side of the networked revolution. “If these machines make everything we need, then what will everyone do all day?”
The founder and CEO of Maker Media shuffled nervously in his chair. He was far too honest and intelligent to feed me some nonsense about a long tail of networked digital makers all trading 3-D selfies of unborn babies with each other in an idyllic global village. His silence was illuminating. Yes, Dougherty has once again seen the future before everyone else. But is this a future that any of us—especially Dale Dougherty—really wants?
Take, for example, the impact of the maker’s 3-D printing revolution on the fashion industry. Back in the early twentieth century, my great-grandfather, Victor Falber, bought fabric from the woolen mills that he then sold in Berwick Street market to people who made their own clothes. This Maker 1.0 economy was replaced in the mid-twentieth century by the Maker 2.0 model of mass-produced, off-the-rack clothing—the Oxford Street of retailers like the Gap, American Apparel, Esprit, and Next. And over the next quarter of a century, as the Maker 3.0 revolution of 3-D printing begins to popularize the manufacturing of personalized clothing, these stores may—like Oxford Street’s 60,000-square-foot HMV music emporium—become redundant.
The Silicon Valley hype machine is beginning to identify the world’s fashion industry as the source of the next great disruption. “Why 3D printing will work in fashion,” one TechCrunch writer glibly reveals.64 “3D-printed fashion: off the printer, rather than off the peg,” is how the Guardian’s Alice Fisher describes the so-called democratization of fashion in which we can all design our own personalized clothing. “It could revolutionize garment sizing and product development in mass production,” Fisher promises. “It could also allow startup labels to produce small orders to avoid unsold stock, and allow easy customization.”65
But weren’t we promised this same customized, personalized, democratized cornucopia in the nineties about the music industry? And didn’t this dream degenerate into the sour reality of a blockbuster-dominated, advertising-saturated industry that has been cut in half by the free and pirated content of Internet startups like Napster?
“Will 3-D printing upend fashion like Napster crippled the music industry?” asked Mashable’s Rebecca Hiscott.66 Yes, it will, I’m afraid. Like recorded music, all the economic value of fashion lies in its intellectual property. So when the design of new clothing is readily available, legally or otherwise, on the Internet and anyone can make personalized dresses or shirts on their 3-D printer, how will professional designers and clothing companies be rewarded for their creative labor? If the demise of the recorded music industry is anything to go by, they won’t. Perhaps fashion designers will have to fall back on Lady Gaga’s Doritos business model of finding sponsors to pay for their performances. Or perhaps in today’s selfie-centric culture, fashion designers will give out their work for free in exchange for us freely advertising them online. In 2014, the hip online clothing retailer ASOS introduced the #AsSeenOnMe promotion, which posts Instagram or Twitter selfies of ourselves wearing ASOS clothing. “Monetize me” is how Contagious’s Katrina Dodd characterizes this strategy, which does, quite literally, turn us into free models for ASOS fashions.67
In May 2014, I spoke alongside the science fiction writer Bruce Sterling at a Wired magazine conference in Milan, the center of Italy’s high-end fashion industry. Sterling, who is refreshingly critical of big data Internet companies like Facebook and Google, advised the audience of local designers to explore what he called “open source luxury” as a way of circumventing Silicon Valley toll booths. But while I’m a great admirer of Sterling’s sparkling imagination, it’s hard to see the business model of open-source luxury. Unlike software, design and creativity can’t remain viable when given away under a free license. Luxury will, by definition, always be a closed system. Once Fendi, Gucci, or Armani give out their source code for free, they become commodities with the same lack of value as the Internet’s cornucopia of online content.
Silicon Valley will, no doubt, profit from this Makers 3.0 revolution. As the investor Fred Wilson predicts, there will inevitably be a winner-take-all YouTube or Uber-style platform—something like the website Etsy, which is already aggregating buyers and sellers of handmade goods and taking a cut of every transaction. But, to paraphrase Robert Levine, the real conflict in the networked maker’s economy is likely to be between the fashion companies that design our clothes and the technology firms that want to distribute their designs—“legally or otherwise.”
“We are at the dawn of the age of sharing where, even if you try to sell things, the world is going to share it anyway,” argues Bre Pettis, the CEO of MakerBot, about the maker’s “sharing” economy.68 But Pettis’s notion of “sharing” is a euphemism for theft, and the consequences of his maker’s revolution threaten to be even more damaging than those that decimated the music industry. After all, if anyone can copy any design and make their own personalized clothing on home 3-D printers, then what becomes of the millions of workers employed in garment factories around the world? Some Western liberals might celebrate the end of the sweatshop. But I’m not sure the millions of people who work in the garment industry would agree.
The “age of sharing” evangelists such as Pettis would, no doubt, accuse me of being a Luddite. But there is no shame in sometimes questioning the economic impact of technology on society. As Paul Krugman reminds us in “Sympathy for the Luddites,” twenty-first-century digital technology is, like it or not, displacing and devaluing educated workers.69 Krugman uses the example of the eighteenth-century woolworkers of Leeds, who questioned the destructive impact of mechanized technology on their jobs, as an inspiration to us today. As the economic historian Eric Hobsbawm notes, it was at these textile mills in the north of England that the global industrial revolution was born. These were the same mills that produced the cloth that my great-grandfather sold in Berwick Street market in the early twentieth century. And they are the same kind of mills that will be radically “disintermediated” in the makers’ 3.0 economy.
Victor Falber had a nephew named Reuben Falber. I always knew him as Uncle Reuben, but the world remembers him quite differently. Reuben Falber was the longtime assistant secretary of the Communist Party of Great Britain, who was exposed after the fall of the Berlin Wall and the collapse of the Soviet Union as the official responsible for laundering large amounts of Soviet cash into Britain to finance the communist revolution.70 But my own memories of Reuben Falber were of a scholarly man who would use quotes from Marx to explain to me why the collapse of capitalism was inevitable.
One of his favorite quotes was from Marx’s 18th Brumaire of Louis Bonaparte. “Men make their own history, but they do not make it just as they please,” my uncle Reuben liked to remind me about the supposedly greater historical forces controlling our destiny. But with 3-D printing, I fear, we will eventually be able to make anything that we please. Everything, that is, except the illusion of our own histories.
CHAPTER SEVEN
CRYSTAL MAN
The Ministry of Surveillance
If we really do make our own histories, then who exactly made the Internet? Technology historian John Naughton claims it was the RAND telecom engineer Paul Baran. TCP/IP inventors Bob Kahn and Vint Cerf say they created it. Others award the honor to “As We May Think” author Vannevar Bush or to J. C. R. Licklider, the “Man-Computer Symbiosis” visionary who dreamed up the Intergalactic Computer Network. More literary types even suggest that the Argentine writer Jorge Luis Borges, the author of stories like “The Library of Babel” and “Funes the Memorious,” about “infinite libraries and unforgetting men,”1 imagined the Internet before anyone else.
Then, of course, there is Albert Arnold “Al” Gore Jr. “During my service in the United States Congress, I took the initiative in creating the Internet,” Gore told CNN’s Wolf Blitzer in March 1999. And if I had a dollar for every bad Al-Gore-invented-the-Internet joke, I could probably afford to be Trevor Traina’s neighbor up on Billionaire Row. But, of course, Gore didn’t invent the Internet, consciously or otherwise, even though the former American vice president has personally profited so massively from what he called the “information superhighway” as an Apple board member, Google advisor, partner of John Doerr and Tom Perkins at KPCB, and cofounder of CurrentTV, that he could, no doubt, afford a $37 million house or two of his own up in San Francisco’s ritziest neighborhood.
But there is one politician who, in contrast with Al Gore, could really claim to have stumbled upon the idea of the Internet almost before anyone else. This politician pioneered a computer network that aggregated all of his country’s educational, medical, financial, and other personal records into a single database. Like big data companies such as Google and Facebook, the goal of this unforgetting politician was to amass so much information about us that he would know us all better than we know ourselves. And like Tim Berners-Lee, this politician was developing his revolutionary idea in 1989.
His name is Erich Mielke and, between 1957 and 1989, he was the head of the East German secret police, the Stasi. I’m half joking, of course, about it being Mielke, rather than Al Gore, who invented the Internet. Although, unlike the amusingly self-important Gore, there’s nothing even vaguely funny about this East German communist politician whose ubiquitous secret police transformed an advanced industrial country into a surveillance camp.
In 2010, Google CEO Eric Schmidt boasted that Google was so familiar with all our habits that it automatically knew where we were and what we had been doing. But twenty-five years before Schmidt’s boast about Google’s data omniscience, Mielke began to develop a similarly ambitious project for a comprehensive database of human intentions. The idea was born in the spring of 1985. Erich Honecker, the head of the East German Communist Party, who liked to think of his country as more technically advanced than the capitalist West, even though its main business model and source of foreign currency was selling its citizens to West Germany, wanted to “start collating computerized files and reports” on all sixteen and a half million people in East Germany.2 The historian Victor Sebestyen describes this as a “computerized snooping system.” Its intention was to digitize the 39 million index cards and 125 miles of documents containing more than 2 billion sheets of paper.3 The goal was a computer system that knew everything about everyone in the country.
By the mideighties, Mielke’s Stasi had become the largest company in East Germany, employing around 100,000 full-time snoops and at least another half a million active informers. According to Stasiland author Anna Funder, Mielke’s organization might have turned as many as 15% of all East Germans into one kind of data thief or another.4 Known as “the Firm” to East Germans, the Stasi was attempting to transform the whole of East Germany into a real-time set of Rear Window. The country was, as Big Data authors Viktor Mayer-Schönberger and Kenneth Cukier note, “one of the most comprehensive surveillance states ever seen.”5 Like Ted Nelson’s Xanadu project to develop hypertext, Mielke’s East Germany eliminated the concept of deletion.
“We had lived like behind glass,” explained the novelist Stefan Heym. Mielke organized his society around the same kind of brightly lit principles that the architect Frank Gehry is now using to build Facebook’s new open-plan office in Silicon Valley. Mark Zuckerberg—who once described Facebook as a “well-lit dorm room” in which “wherever you go online you see your friends”6—describes this multimillion-dollar Gehry creation as “the largest open office space in the world.” Gehry’s “radically transparent” building will be without internal walls, floors, or private offices, even for senior Facebook executives. Its purpose, Zuckerberg explains, is to make “the perfect engineering space.”7 But Gehry’s office is an architectural metaphor for Zuckerberg’s cult of the social: a well-lit place where not only can you see your friends, but your friends—especially, of course, the autistic Facebook founder—can see you.
Mielke amassed personal data with the same relentlessness that Google’s Street View car collected the emails, photos, and passwords of online German citizens between 2008 and 2010—a privacy breach that Johannes Caspar, the German regulator in charge of the investigation into Google’s behavior, described as “one of the biggest data-protection rules violations known.”8 But, as a violator of our online data, Google faces stiff competition from its rival Facebook. TechCrunch’s Natasha Lomas suggests that Facebook’s “creepy data-grabbing ways,” such as the 2013 harvesting of the personal contact information of 6 million of its users, or that secret 2012 study to control the emotions of 689,000 of its users,9 make it the “Borg of the digital world.”10 WikiLeaks founder Julian Assange, who knows a thing or two about spying himself, even accuses Facebook of being the “greatest spying machine the world has ever seen.”11
So is Facebook really the greatest spying machine in world history—greater than either the Stasi, the CIA, or Google? Citing Google’s Street View car privacy violations, the German privacy regulator Johannes Caspar might doubt Assange’s assertion, as probably would privacy watchdogs in the United Kingdom, Germany, and Italy who collectively told Google in the summer of 2013 that the company would face legal sanctions unless it changed its 2012 policy of unifying personal data collected from all its different services.12 Others would also award this dubious honor to Google. Such as those plaintiffs who, in a July 2013 California court case, claimed that “Google uses Gmail as its own secret data-mining machine, which intercepts, warehouses, and uses, without consent, the private thoughts and ideas of millions of unsuspecting Americans who transmit e-mail messages through Gmail.”13 Or the millions of students whose email messages were allegedly surreptitiously captured by Google to build advertising profiles of them—a “potentially explosive” lawsuit, according to Education Week magazine, that is also currently being heard in US federal court.14
In any case, before Facebook and Google, there was Erich Mielke’s twentieth-century Stasi. Mielke was originally against Honecker’s vision of digitizing all the Stasi’s analog records. But by August 1989, as protests against the communist regime intensified, he gave the order to begin the digital collation of information on every East German citizen. Officially known as the “Regulation for the Use of Stored Data,” it sought to collect all personal data from the country’s legal institutions, banks, insurance agencies, post offices, hospitals, libraries, and radio and television companies. According to the East Germany historian Stefan Wolle, Mielke was particularly enthusiastic about the “complete interconnectedness” of this data project.15 Rather than socialist man, Wolle says, Mielke wanted to create crystal man (“der gläserne Mensch”). Mielke’s goal was to build a place where everyone lived “behind glass” and nobody could escape his electronic gaze.
In contrast, however, with the Internet, Erich Mielke’s “Regulation for the Use of Stored Data” project was never realized. The Wall fell in November 1989 and he was arrested in 1990, imprisoned in 1993, and died in 2000. But Mielke’s work has been memorialized in the old Berlin headquarters of the Stasi, which has been transformed into a museum displaying the technologies of surveillance that he used to snoop on the East German people.
The former East German Ministry for State Security is located on a particularly gray, nondescript street near the Magdalenenstrasse U-Bahn station, a few subway stops away from the center of Berlin. It’s not too far from the US embassy on Pariser Platz, where, the American whistle-blower Edward Snowden revealed, the NSA had a spy hub that monitored the cell phone calls of German chancellor Angela Merkel16—a privacy breach that so angered Merkel that the chancellor, who grew up in East Germany, compared the snooping practices of the NSA to the Stasi’s.17 Nor is it a great distance from the British embassy beside the Brandenburg Gate, where, according to documents leaked by Snowden, the British intelligence agency GCHQ was running its own separate spying operation on the German government.18
The gray old Stasi headquarters in Berlin, permanently frozen now in its 1989 appearance, is defiantly analog. The Stasiplex certainly is no Googleplex. There’s nothing high-tech about either the office’s dingy, cramped rooms or the electric typewriters, rotary telephones, and primitive switchboards on all of its desks. In spite of Mielke’s order to network its information, much of the Stasi’s data in 1989 was still confined to handwritten or typed index cards. The museum even has an index card on display written by Mielke’s secretary explaining what the Stasi chief liked to eat for breakfast.