by Andrew Keen
The paradox is that Kodak has been a victim of abundance rather than scarcity. The more ubiquitous online photo sharing has become, and the easier it’s become to take pictures with our smartphones and tablets, the less anyone has needed Kodak. “You press the button, we do the rest,” George Eastman famously boasted. But the digital revolution has made photography so easy that there is no longer any rest to do. And so, between 2003 and 2012—the age of multibillion-dollar Web 2.0 startups like Facebook, Tumblr, and Instagram—Kodak closed thirteen factories and 130 photo labs and cut 47,000 jobs in a failed attempt to turn the company around.43 And then, having emerged from Chapter 11 bankruptcy in 2013, Kodak committed suicide to avoid being murdered. Trying to reinvent itself as a “commercial imaging company serving business markets like packaging and graphics,”44 Kodak got out of the consumer picture business entirely. It was as if Kleenex suddenly stopped selling tissues or Coca-Cola withdrew overnight from the fizzy drinks business. Kodak sold its online photo-sharing site and its portfolio of digital imaging patents—what the New York Times described as its “crown jewel”45—to Silicon Valley vultures like Apple, Facebook, and Google that were eager to pick over the carcass.46 After all these self-inflicted cuts, there wasn’t much left of the company. By October 2013, only 8,500 people worked for Kodak.47 The game was up. Kodak was dead.
But Kodak—or, at least, its carcass—still existed in Rochester. And after driving around for a while, I did manage to find a company office. The building was at the intersection of State and Factory Streets in the old industrial part of town, a few blocks from the visitors center. KODAK: WORLD HEADQUARTERS, a dull corporate plaque advertised outside a sixteen-story building that, when it was constructed in 1914, had been the tallest building in Rochester. It was built from the same sort of industrial brick that made up the old Musto marble factory in downtown San Francisco. But that’s all it had in common with the reinvented 58,000-square-foot Battery club. A desultory American flag flew outside this former skyscraper. On the corner of Factory Street, there was a row of stores that had all been shuttered. SAMBA CAFÉ: IT’S JUST SENSATIONAL, a faded neon sign claimed above a boarded-up Brazilian restaurant. FLOWER CITY, another derelict store advertised over a scratched-out sign that said JEWELERS.
This palimpsest of industrial life was a picture worth at least a thousand words. And so, parking my rental car in the building’s empty driveway, I took out my iPhone and began to photograph the desolation. The scene was so real that I didn’t even need to switch on the camera’s “noir” and “tonal” filters that had been designed to give my amateurish pictures more authenticity. But my shoot didn’t last long. After a few minutes, an ancient security guard shuffled out of the old brick building and told me that photography wasn’t permitted. I smiled sadly. No snaps allowed in Snap City. It was like outlawing email in Palo Alto or banning driving in Detroit.
I returned to the old woman at the empty visitors center. She brightened up when she saw me. “You’ll find people at the Eastman House,” she said after I described my people-less drive around Innovation Way and Creative Drive. “That’s the only reason anyone comes to Rochester.”
It was a good reason—particularly for anyone seeking to make sense of Kodak’s fate. Located a couple of miles away from downtown Rochester, the Eastman House was the most meretricious mansion on a wide, leafy street lined with the sprawling trophy homes of Gilded Age industrial magnates. Built by George Eastman between 1902 and 1904, it was declared a National Historic Landmark in 1966. Housing more than 400,000 photographs and 23,000 films as well as a collection of antique Kodak cameras, the Eastman House is now one of the world’s leading museums of photography and film.48
In “Kodachrome,” Paul Simon sang about being able to read what had been written on the wall. And the writing about Kodak’s fate was certainly written with a vivid, Kodachrome-like clarity on the Eastman House wall.
On a long white wall at the entrance to the museum was a timeline of the history of photography. Beginning in fifth-century BC China, with the first record of an optical device designed to capture light, the timeline included the creation of the first image from a camera obscura in 1826, the invention of the modern Zoetrope in 1834, Kodak’s introduction of the first mass-market camera for children in 1900, Tim Berners-Lee’s formal proposal to develop a protocol for the World Wide Web in 1992, Kodak’s decision to stop producing cameras in 2004, and the fact that 380 billion photos, a remarkable 11 percent of all the photographs ever taken, were snapped in 2011.49
The timeline ended in 2012 with four entries that could have been grouped under the headline REALLY CREATIVE DESTRUCTION:
— Eastman Kodak Company files for bankruptcy under Chapter 11
— Instagram has over 14 million users and hosts about 1 billion photographs
— There are over 6 billion photographs on Flickr
— There are over 500 billion photographs on Facebook
And that was it. The timeline concluded there. The writing was indeed on the wall for Rochester and Kodak. Silicon Valley sledgehammers like Flickr and Facebook had smashed old Rochester into smithereens. The Kodak economy had been replaced by the Facebook economy with its 500 billion free photographs. No wonder the Eastman House, as a memorial to a now-extinct analog industry, had been transformed into a museum. No wonder the only people who now visit Rochester come to gaze nostalgically at its past rather than imagine its future.
The Broken Center
“So what?” apologists for radical disruption like Tom Perkins will ask about Kodak’s usurpation by Internet companies like Instagram, Flickr, and Facebook. Tragedy in Rochester, they will say, equals opportunity for West Coast entrepreneurs. Nostalgia, the determinists will remind us, is a Luddite indulgence. And the writing on the wall, they will add, eventually appears for everyone.
In some ways, of course, they are right. For better or worse, technological change—especially in our digital age of creative destruction—is pretty much inevitable. Paul Simon himself once described this to me with a bittersweet regret. “I’m personally against Web 2.0 in the same way as I’m personally against my own death,” he said about the damage unleashed by the Internet upon the music industry—a hurricane that has flattened both the traditional labels and the economic value of recorded music.50 “We’re going to 2.0,” Simon predicted to me. “Like it or not, that is what is going to happen.”51
“There will be no more Kodak Moments. After 133 years, the company has run its course,” Don Strickland, the Kodak executive who had unsuccessfully encouraged the company to pioneer the digital camera, concluded in 2013.52 “Kodak was caught in a perfect storm of not only technological, but also social and economic change,” added Robert Burley, a Canadian photographer whose work memorializes Kodak’s decline.53
No, nothing lasts forever. And certainly the Kodak tragedy can be seen, at least in part, as a parable of a once-mighty monopolist, a Google of the industrial age, that couldn’t adapt to the digital revolution. Yes, Kodak failed to become a leader in digital photography, in spite of the fact that the company actually invented the digital camera, back in 1975.54 Yes, Kodak is, in part, a victim of what Harvard Business School professor Clayton Christensen calls, in his influential 1997 book about why businesses fail, The Innovator’s Dilemma,55 the challenge of a once-dominant incumbent having to disrupt its own business model. Yes, a string of myopic executives failed to reinvent Kodak, with the result that a great company that up until the 1990s was often listed among the world’s top five most valuable brands56 has now become synonymous with failure. And yes, tragedy in Rochester spells economic opportunity elsewhere, particularly for West Coast entrepreneurs like Jeff Bezos, who has made Christensen’s Innovator’s Dilemma required reading for all Amazon executives.57
“Maybe a fire is what’s needed for a vigorous new growth, but that’s the long view,” Paul Simon said to me about the Internet’s disruptive impact on creative industries like recorded music and photography. “In the short term, all that’s apparent is the devastation.” But what happens if the devastation is not only permanent, but also the defining feature of our now twenty-five-year-old digital economy? What happens if the tragedy in Rochester is actually a sneak preview of our collective future—a more universal perfect storm of technological, social, and economic change?
Welcome to what Joshua Cooper Ramo, the former executive editor of Time magazine, calls “the age of the unthinkable.” It’s a networked age, Ramo says, in which “conformity to old ideas is lethal”58 and predictability and linearity have been replaced by what he calls an “epidemic” of self-organization where no central leadership is required. This is an age so destructively unthinkable, in fact, that Clayton Christensen’s theory of the “Innovator’s Dilemma,” which suggests an orderly cycle of disruptors, each replacing the previous economic incumbent, has now itself been blown up by an even more disruptive theory of early-twenty-first-century digital capitalism.
Christensen’s ideas have themselves been reinvented by the bestselling business writers Larry Downes and Paul F. Nunes, who’ve replaced the “Innovator’s Dilemma” with the much bleaker “Innovator’s Disaster.” “Nearly everything you think you know about strategy and innovation is wrong,” Downes and Nunes warn about today’s radically disruptive economy.59 In their 2014 book, Big Bang Disruption,60 they describe an economy in which disruption is devastating rather than creative. It’s a world, they say, in which Joseph Schumpeter’s “perennial gales of creative destruction” have become Category 5 hurricanes. Upheavals from big-bang disruptors like Google, Uber, Facebook, and Instagram “don’t create dilemmas for innovators,” Downes and Nunes warn, “they trigger disasters.”61 And Kodak is the textbook example of this kind of disaster—a $31 billion company employing 145,000 people that, as they note, was bankrupted “gradually and then suddenly”62 by the hurricane from Silicon Valley.
“An entire city,” Jason Farago wrote about the impact of the Kodak bankruptcy on Rochester, “has lost its center.”63 But the real disaster of the digital revolution is much more universal than this. In today’s networked age, it’s our entire society that is having its center destroyed by a “perfect storm” of technological, social, and economic change. The twentieth-century industrial age, while far from ideal in many ways, was distinguished by what George Packer, writing in the New York Times, calls the “great leveling” of the “Roosevelt Republic.”64 For hard-line neoliberals like Tom Perkins, Packer’s “great leveling” probably raises the specter of a socialist dystopia. But for those not fortunate enough to own a $130 million yacht as long as a football field, this world offered an economic and cultural center, a middle ground where jobs and opportunity were plentiful.
The late industrial age of the second half of the twentieth century was a middle-class world built, Packer notes, by “state universities, progressive taxation, interstate highways, collective bargaining, health insurance for the elderly, credible news organizations.”65 According to the Harvard economists Claudia Goldin and Lawrence Katz, this was a “golden age” of labor in which increasingly skilled workers won the “race between education and technology” and made themselves essential to the industrial economy.66 And, of course, it was a world of publicly funded institutions like ARPA and NSFNET that provided the investment and opportunities to build valuable new technologies like the Internet.
But this world, Rochester’s fate reminds us, is now passing. As Sequoia Capital chairman Michael Moritz observes, today’s information economy is marked by an ever-increasing inequality between an elite and everyone else. It’s a donut-shaped economy without a middle. Moritz thus describes as “brutal” both the drop in the US minimum wage, adjusted for inflation, from $10.70 in 1968 to $7.25 in 2013, and the flattening of a median household income that, in inflation-adjusted terms, has crawled up only from $43,868 to $52,762 over the same forty-five-year period.67
According to the New York Times columnist David Brooks, this inequality represents capitalism’s “greatest moral crisis since the Great Depression.”68 It’s a crisis, Brooks says, that can be captured in two statistics: the $19 billion Facebook acquisition of the fifty-five-person instant messaging Internet app WhatsApp in February 2014, which valued each employee at $345 million; and the equally disturbing fact that the slice of the economic pie for the middle 60 percent of earners in the US economy has dropped from 53 percent to 45 percent since 1970. The Internet economy “produces very valuable companies with very few employees,” Brooks says of this crisis, while “the majority of workers are not seeing income gains commensurate with their productivity levels.”69
In his 2013 National Book Award–winning The Unwinding, George Packer mourns the passing of the twentieth-century Great Society. What he calls “New America” has been corrupted, he suggests, by its deepening inequality of wealth and opportunity. And it’s not surprising that Packer places Silicon Valley and the multibillionaire Internet entrepreneur and libertarian Peter Thiel at the center of his narrative.
The cofounder, with Elon Musk, of the online payments service PayPal, Thiel became a billionaire as the first outside investor in Facebook, after being introduced to Mark Zuckerberg by Sean Parker, the cofounder of Napster and Facebook’s founding president. The San Francisco–based Thiel lives in a “ten thousand square foot white wedding cake of a mansion,” 70 a smaller but no less meretricious building than the Battery. His decadent house and dinner parties are the stuff of San Francisco high-society legend, featuring printed menus, unscheduled Gatsby-like appearances from the great Thiel himself, and waiters wearing nothing except their aprons. The reclusive Thiel has reinvented himself as a semi-mythical figure—a Gatsby meets Howard Hughes meets Bond villain. He’s a Ferrari-driving, Stanford-educated moral philosopher, a chess genius and multibillionaire investor who is accompanied everywhere by a “staff of two blond, black-clad female assistants, a white-coated butler and a cook who prepares a daily health drink of celery, beets, kale, and ginger.”71
It would be easy, of course, to dismiss Peter Thiel as an eccentric with cash. But that’s the least interesting part of his story. He is, in fact, an even richer, smarter, and—as a major funder of radical American libertarians like Rand Paul and Ted Cruz—more powerful version of Tom Perkins. Peter Thiel has everything: brains, charm, prescience, intellect, charisma; everything, that is, except compassion for those less successful than him. In the increasingly unequal America described in Packer’s The Unwinding, Thiel is the supreme unwinder, a hard-hearted follower of Ayn Rand’s radical free-market philosophy who unashamedly celebrates the texture of inequality now reshaping America.
“As a Libertarian,” Packer notes, “Thiel welcomed an America in which people could no longer rely on old institutions or get by in communities with long-standing sources of security, where they knew where they stood and what they were bound for.”72 Thiel would, therefore, certainly welcome today’s age of the unthinkable, in which conformity to old ideas is lethal. He would probably welcome the sad fate of old industrial towns like Rochester. He might even welcome the sadder fate of those fifty thousand retirees at Kodak who lost their pensions because of the company’s bankruptcy.
So what? Thiel might say about these impoverished old people who spent their entire lives working for Kodak and who no longer even own their pensions. So what? the multibillionaire with the black-clad female assistants, the white-coated butler, and the cook might say about today’s libertarian age, in which a twenty-first-century networked capitalism is collapsing the center of twentieth-century economic life.
You Better Watch Out
To pin all the blame for society’s broken center on the Internet would, of course, be absurd. However, Internet economics are now compounding the growing silicon chasm in society. Robert Frank and Philip Cook’s 1995 The Winner-Take-All Society was one of the first books to recognize the corrosive impact of information technology on economic equality. Until then, it was assumed that technological innovation was beneficial for society. Vannevar Bush, in the “Science, the Endless Frontier” report he wrote for Roosevelt in 1945, took it for granted that constant scientific and technological progress would inevitably lead to both more jobs and general prosperity. And this optimism was reflected in the work of the MIT economist Robert Solow, whose 1987 Nobel Prize in Economics was awarded for his research showing that, over the long term, labor and capital maintained their share of rewards in a growing economy. But even Solow, whose research was mostly based on productivity improvements from the 1940s, ’50s, and ’60s, later became more skeptical of labor’s ability to maintain its parity with capital in reaping the rewards of greater economic productivity. In a 1987 New York Times Book Review piece titled “We’d Better Watch Out,” he acknowledged that what he called “Programmable Automation” hadn’t increased labor productivity. “You can see the computer age everywhere,” as he memorably put it, “but in the productivity statistics.”73
Timothy Noah, the author of The Great Divergence, a well-received book on America’s growing inequality crisis, admits that computer technology does create jobs. But these, he says, are “for highly skilled, affluent workers,” whereas the digital revolution is destroying the jobs of “moderately skilled middle class workers.”74 The influential University of California, Berkeley economist and blogger J. Bradford DeLong has suggested that the more central a role information technology plays in traditionally skilled professions like law or medicine, the fewer jobs there might be.75 Loukas Karabarbounis and Brent Neiman, two economists from the University of Chicago’s business school, have found that since the mid-1970s the share of income going to workers has been in decline around the world.76 Meanwhile, research by three Canadian economists, Paul Beaudry, David Green, and Benjamin Sand, has found a similarly steep decline in midlevel jobs—a depressing development that MIT’s David Autor, Northeastern University’s Andrew Sum, and the president of the Economic Policy Institute, Larry Mishel, have also documented in their own research.77