The Internet Is Not the Answer
But for all its millions of meticulously transcribed index cards, East Germany’s former ministry of surveillance is as much an introduction to the digital future as it is a look back at the analog past. As a museum displaying how technology was used to acquire other people’s data, the Stasi museum—in our age of data-hungry multinationals like Google and Facebook and big data agencies like the NSA and GCHQ—has contemporary relevance. Like the seemingly insatiable thirst of both contemporary government intelligence organizations and Internet companies for our most intimate information, the Stasi’s appetite for data was astonishing. The museum’s exhibits include such data-gathering technologies as miniature cameras hidden in pens, tie pins, and neckties. It has several rooms dedicated to showing off locally made Zeiss-Jena cameras concealed in handbags, briefcases, purses, and thermoses. One exhibit even features a watering can with a hidden camera near its spout.
Yet there is one thing about the Firm that distinguishes it from Internet companies like Google or Facebook. To borrow a popular Silicon Valley term, Erich Mielke’s operation didn’t “scale.” Mielke was a twentieth-century information thief who may have transformed East Germany into a data kleptocracy. But compared with twenty-first-century data barons, he was still thinking too small, too locally, about his information empire. It never occurred to him that billions of people around the world might freely give up their most personal data. And he didn’t understand that there were far more scalable strategies for aggregating people’s photos than disguising cameras inside watering cans.
No, to create a truly global crystal man, it wasn’t enough to put a hundred thousand spies on your payroll and have 39 million handwritten index cards. On the Internet, an electronic world of increasingly intelligent connected machines that Nicholas Carr calls a “Glass Cage,” there are billions of “gläserne Menschen.” And, it seems, they are all willing to work for free.
The Eyes of the Venetian
In Las Vegas, there isn’t a casino built around the theme of either East Germany or the Berlin Wall, surprisingly enough. But Las Vegas does possess one entertainment complex that pays homage to another of history’s great spying machines—the Venetian Republic, which, in its fifteenth- and sixteenth-century heyday, was notorious for its dense network of spies working for the State Inquisitors, a panel of judges that was a late medieval version of the Stasi. So it was serendipitous that a part of the 2014 Consumer Electronics Show (CES), the world’s largest event dedicated to networked consumer devices, was held in Las Vegas’s version of Venice—the Venetian Resort Hotel Casino. Situated on the Las Vegas Strip, the Venetian, with its gaudily inauthentic piazzas and canals, represents a version of the Italian city-state that might be charitably described as augmented reality.
At CES 2014, surveillance technologies were, so to speak, on show throughout the Venetian. Companies were demonstrating networked cameras that could do everything from peeping under walls and peering around corners to peeking through clothing. It was like being at a conference for spooks. At the Indiegogo-sponsored section of the show, hidden in the bowels of the Venetian, one crowd-financed startup from Berlin named Panono was showing off what it called a “panoramic ball camera,” an 11 cm electronic ball with thirty-six tiny cameras attached to it, that took panoramic photos whenever the ball was thrown in the air and then, of course, distributed them on the network. Another Indiegogo company, an Italian startup called GlassUP, was demonstrating fashionably designed glasses that—like Google Glass—recorded everything they saw and provided what it called a “second screen” to check emails and read online breaking news. There were even “Eyes-On” X-ray style glasses, from a company called Evena Medical, that allowed nurses to see through a patient’s skin and spy the veins underneath. Just about the only thing I didn’t see in the Venetian was cameras hidden inside watering cans.
There were electronic eyes everywhere one looked. There was even an entire exhibition dedicated to intelligent eyeglasses. This “Retrospective Exhibition: 35 Years of Augmented Reality Eyewear,” a kind of history of the future, was held inside the “Augmented Reality Pavilion” in the Venetian. It featured row upon row of plastic heads, all wearing augmented glasses that had been developed over the last thirty-five years. The exhibition was sponsored by two of today’s leading developers of augmented glasses—an Israeli firm called OrCam, and Vuzix, whose $1,000 Smart Glasses M100 model, the world’s first commercially available networked eyewear, features a hands-free camera that can record everything it sees. “Unforgettable” or “Public Eye” might have been more appropriate names for Vuzix’s creepy surveillance glasses.
“Welcome to Infinite Possibilities,” proclaimed one banner hanging at the Venetian to welcome CES attendees. “Living in Digital Times: Connecting Life’s Dots,” announced another, describing a networked world in which the volume of data produced between 2012 and 2013 made up 90% of all the data produced in human history.19 In 2012, we produced 2.8 zettabytes of data, “a number that’s as gigantic as it sounds,” according to data expert Patrick Tucker, and by 2015 that number is expected to double to over 5.5 zettabytes. To put this into perspective, in 2009 all the content on the World Wide Web was estimated to have added up to about half a zettabyte of data.20
But, rather than infinite, the possibilities of most of the new electronic hardware at CES 2014 were really all the same. They were all devices greedy for networked data. These devices, some of them crowdfunded on networks like Indiegogo and Kickstarter, were designed to connect our dots—to know our movements, our tastes, our physical fitness, our driving skills, our facial characteristics, and, above all, where we’ve been, where we are, and where we are going.
Wearable technology—what the Intel CEO Brian Krzanich in his keynote speech at the show called a “broad ecosystem of wearables”—dominated CES 2014. Sony, Samsung, and many, many startups were all demonstrating products that wouldn’t have been out of place at that old East German Ministry for State Security in Berlin. Two of the most hyped companies producing so-called quantified-self products at CES were Fitbit, the maker of a wrist device that tracks physical activity and sleep patterns, and Sweden-based Narrative, the manufacturer of a tiny camera clip, worn on a lapel, that automatically takes a photo every thirty seconds, a “lifelogging” device that records everything it sees.
“What’s interesting about both companies is they make the invisible part of our lives visible, in an ambient ongoing fashion,” explained one venture capitalist who’d invested in Fitbit and Narrative.21 Thirty years ago, Mielke would have likely bought Narrative devices for the entire East German population. But today, the only bulk orders are likely to come from North Korea.
But it wasn’t just Fitbit and Narrative that were making the invisible visible. Everywhere at CES, companies were introducing surveillance products designed to spew out our personal data. I judged a CES “hackathon” in which entrants innocently developed “innovative” new surveillance products, including hats and hoodies outfitted with sensor chips that instantly revealed the location of their wearer. A Canadian company, OMSignal, was demonstrating spandex clothing that wirelessly measured heart rate and other health data. Another smart clothing company, Heapsylon, even had a sports bra made of textile electrodes designed to monitor its wearer’s vital statistics.22
While Google wasn’t officially represented in the Augmented Reality Pavilion, there were plenty of early adopters wandering around the Venetian’s fake piazzas and canals wearing demonstration models of Google Glass, Google’s networked electronic eyeglasses. Michael Chertoff, the former US secretary of homeland security, described these glasses, which have been designed to take both continuous video and photos of everything they see, as inaugurating an age of “ubiquitous surveillance.”23 Chertoff is far from alone in being creeped out by Google Glass. Several San Francisco bars have banned Google Glass wearers—known locally as “Glassholes”—from entry. The US Congress has already launched an inquiry into their impact on privacy. And in June 2013, privacy and data officials from seven countries, including Canada, Australia, Mexico, and Switzerland, sent Google CEO Larry Page a letter expressing their discomfort about the impact of these glasses on privacy. Like Chertoff, they feared a “ubiquitous surveillance”—a world in which everyone was being watched all the time by devices collecting massive amounts of our most personal health, location, and financial data.24
But it wasn’t only wearables that were on show at CES. To borrow the corporate language of Intel CEO Krzanich, it was the “broad ecosystem of life” that was being networked by all these new electronic devices spewing out those zettabytes of data that, according to Patrick Tucker, are now making anonymity impossible.25 The Internet of Things had arrived in Las Vegas. Quite literally, everything at CES was becoming networked, everything was being reinvented as a smart, connected device. There were smart ovens, smart clothing, smart thermostats, smart air conditioners, smart lighting systems, and smartphones, of course, all designed to capture data and distribute it on the network. One part of the show was dedicated to smart televisions—devices much more intelligent than most TV shows themselves. Indeed, South Korean electronics giant LG’s connected televisions are so intelligent that they are already logging our viewing habits in order to serve us up targeted ads.26
Another part of CES was dedicated to the connected car—automobiles that are so all-seeing they know our speed, our location, and whether or not we are wearing our seat belt. According to the consultancy Booz, the market for connected cars is about to explode, with demand expected to quadruple between 2015 and 2020 and generate revenues of $113 billion by 2020.27 But even today’s connected car is a data machine, with the onboard cameras from Mercedes-Benz’s new S-Class saloon already generating 300 gigabytes of data per hour about the car’s location and speed and the driver’s habits.28
And then there’s Google’s driverless car, an artificially intelligent, networked car that is driven by software called Google Chauffeur. The idea of driverless cars might sound as science fictional as the idea of augmented reality glasses—but Nevada and Florida have already passed laws permitting their operation, and the sight of trial versions of Google’s automated cars driving themselves up and down Route 101 between San Jose and San Francisco is not an uncommon one. While there’s no doubt that driverless cars have enormous potential benefits, particularly in terms of safety and convenience, not to mention the environmental benefits of much lighter and thus more energy-efficient vehicles, Google’s pioneering role in them is deeply problematic. The software that powers these cars, Google Chauffeur, is essentially the automotive version of Google Glass: a “free” product designed to track everywhere we go and to feed all that data back to the main Google database so that it can connect the dots of our lives. As the Wall Street Journal columnist Holman Jenkins notes about these so-called autonomous driverless vehicles, “they won’t be autonomous at all,” and they may “pose a bigger threat to privacy than the NSA ever will.”29 After all, if Google links the data collected from its driverless cars with the data amassed from the rest of its ubiquitous products and platforms—such as the smartphone it is developing that uses 3-D sensors to automatically map our physical surroundings so that Google always knows where we are30—then you have a surveillance architecture that exceeds anything Erich Mielke, in his wildest imagination, ever dreamed up.
Tim Berners-Lee invented the Web in order to help him remember his colleagues at CERN. “The Web is more a social creation than a technical one,” he explains. “I designed it for a social effect—to help people work together—and not as a technical toy. The ultimate goal of the Web is to support and improve our weblike existence in the world. We clump into families, associations, and companies. We develop trust across the miles and distrust around the corner.”31
But when Berners-Lee invented the Web in 1989, he never imagined that this “social creation” could be used so repressively, both by private companies and governments. It was George Orwell who, in 1984, invented the term “Big Brother” to describe secret policemen like Erich Mielke. And as the Internet of Things transforms every object into a connected device—50 billion of them by 2020 if we are to believe Patrik Cerwall’s researchers at Ericsson, with five and a half zettabytes of data being produced by 2015—more and more observers are worrying that twentieth-century Big Brother is back in a twenty-first-century networked guise—dressed in a broad ecosystem of wearables. They fear a world resembling that exhibition at the Venetian in which row after row of nameless, faceless data gatherers wearing all-seeing electronic glasses watch our every move.
Big Brother seemed ubiquitous at the Venetian. Reporting about CES, the Guardian’s Dan Gillmor warned that networked televisions that “watch us” are “closing in on Orwell’s nightmarish Big Brother vision.”32 Even industry executives are fearful of the Internet of Things’s impact on privacy, with Martin Winterkorn, the CEO of Volkswagen, warning in March 2014 that the connected car of the future “must not become a data monster.”33
But there is one fundamental difference between the Internet of Things and Erich Mielke’s twentieth-century Big Brother surveillance state, one thing distinguishing today’s networked society from Orwell’s 1984. Mielke wanted to create crystal man against our will; in today’s world of Google Glass and Facebook updates, however, we are choosing to live in a crystal republic where our networked cars, cell phones, refrigerators, and televisions watch us.
The Panopticon
“On Tuesday I woke up to find myself on page 3 of the Daily Mail,” wrote a young Englishwoman named Sophie Gadd in December 2013. “That may be one of the worst ways to start the day, after falling out of bed or realizing you’ve run out of milk. My appearance was not the result of taking my clothes off, but the consequence of a ‘Twitter Storm.’”34
A final-year history and politics undergraduate at the University of York, Gadd had inadvertently become part of a Twitter storm when, while on vacation in Berlin, she tweeted a painting of the eighteenth-century Russian czarina Catherine the Great from the Deutsches Historisches Museum in Berlin. In her tweet, Gadd suggested that the face in the painting, completed in 1794 by the portrait painter Johann Baptist Lampi, bore an uncanny resemblance to that of the British prime minister David Cameron.
“Within hours,” Gadd explains, “it had been retweeted thousands of times,” with the tweet eventually becoming a major news story in both the Daily Mail and the Daily Telegraph. “This experience has certainly taught me a few things about viral social media,” Gadd says, including the observations—which have already been made by many other critics, including Dave Eggers in The Circle, his 2013 fictional satire of data factories like Google and Facebook—that “the Internet is very cynical” and “nothing is private.”35
Gadd’s experience was actually extremely mild. Unlike other innocents caught up in an all-too-public tweet storm, she didn’t lose her job or have her reputation destroyed by a vengeful online mob or land up in jail. The same month, for example, that Sophie Gadd woke up to find herself on page 3 of the Daily Mail, a PR executive named Justine Sacco tweeted: “Going to Africa. Hope I don’t get AIDS. Just Kidding. I’m white!” Sacco published it as she was about to board a twelve-hour flight from London to Cape Town. By the time Sacco arrived in South Africa, she had been retweeted only three thousand times, yet she had become such a source of global news that the paparazzi were there to snap her image as she stumbled innocently off her plane. Labeled the Internet’s public enemy number one for her stupid tweet, Sacco lost her job and was even accused of being a “f****** idiot” by her own father.36 Sacco will now forever be associated with this insensitive but hardly criminal tweet. Such is the nature and power of the Internet.
“When you only have a small number of followers, Twitter can feel like an intimate group of pub friends,” Sophie Gadd notes about a social Web that is both unforgetting and unforgiving.37 “But it’s not. It’s no more private than shouting your conversations through a megaphone in the high street.”
The dangers of the crystal republic predate George Orwell’s 1984 and twentieth-century totalitarianism. They go back to the enlightened despotism of Catherine II of Russia, the subject of Johann Baptist Lampi’s portrait hanging in Berlin’s Deutsches Historisches Museum, the David Cameron look-alike painting that had landed Sophie Gadd on page 3 of the Daily Mail.
The Italian-born Lampi hadn’t been the only late-eighteenth-century European to go to Russia to enjoy Catherine the Great’s largesse. Two English brothers, Samuel and Jeremy Bentham, also spent time there, gainfully employed by Catherine’s autocratic regime. Samuel worked for Count Grigory Potemkin, one of Catherine’s many lovers, whose name has been immortalized by the “Potemkin villages” of fake industrialization he built to impress her. Potemkin gave Bentham the job of managing Krichev, his hundred-square-mile estate on the Polish border that boasted fourteen thousand male serfs.38 And it was here that Samuel and his brother Jeremy, who joined him at Krichev in 1786 and is best known today as the father of the “greatest happiness” principle, invented the idea of what they called the “Panopticon,” or the “Inspection House.”
While Jeremy Bentham—who happened to have graduated from the same Oxford college as Tim Berners-Lee—is now considered the author of the Panopticon, he credits his brother Samuel with its invention. “Morals reformed—health preserved—industry invigorated—instruction diffused—public burthens lightened—Economy seated, as it were, upon a rock—the Gordian knot of the poor law not cut, but untied—all by a simple idea in Architecture!” Jeremy Bentham wrote triumphantly in a letter from Krichev to describe this new idea.