The Digital Divide


by Mark Bauerlein


  Even without sensor-driven purchasing, real-time information is having a huge impact on business. When your customers are declaring their intent all over the Web (and on Twitter), whether through their actions or their words, companies must both listen and join the conversation. Comcast has changed its customer service approach using Twitter; other companies are following suit.

  Another striking story we’ve recently heard about a real-time feedback loop is the Houdini system used by the Obama campaign to remove voters from the Get Out the Vote calling list as soon as they had actually voted. Poll watchers in key districts reported in as they saw names crossed off the voter lists; these were then made to “disappear” from the calling lists that were being provided to volunteers. (Hence the name Houdini.)

  Houdini is Amazon’s Mechanical Turk writ large: one group of volunteers acts as sensors, multiple real-time data queues are synchronized, and the merged result updates the instructions given to another group of volunteers acting as actuators in the same system.
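  The pattern is simple enough to sketch in a few lines of Python. What follows is a minimal illustration of that sensor-to-actuator loop, not a description of the campaign’s actual software: the names (poll_watcher, list_updater, next_call), the single in-process queue, and the three-voter call list are all invented for the example.

    import queue
    import threading

    # Hypothetical sketch of a Houdini-style feedback loop. Poll watchers
    # act as sensors, a shared queue carries their reports, and callers
    # pulling from the list act as actuators. All names are illustrative.

    voted_reports = queue.Queue()           # sensor side: "this voter has voted"
    call_list = {"alice", "bob", "carol"}   # actuator side: who still gets a call
    call_list_lock = threading.Lock()

    def poll_watcher(observed_voters):
        """Sensor: report each name seen crossed off the precinct roll."""
        for voter in observed_voters:
            voted_reports.put(voter)

    def list_updater():
        """Synchronizer: drain reports and make voted names 'disappear'."""
        while True:
            voter = voted_reports.get()
            if voter is None:               # sentinel: stop the worker
                break
            with call_list_lock:
                call_list.discard(voter)

    def next_call():
        """Actuator: a volunteer pulls the next name still on the list."""
        with call_list_lock:
            return next(iter(call_list), None)

    updater = threading.Thread(target=list_updater)
    updater.start()
    poll_watcher(["bob"])                   # a watcher sees Bob vote
    voted_reports.put(None)
    updater.join()
    print("bob" in call_list)               # False: Bob has disappeared

  The essential design choice is the shared queue: sensors never touch the call list directly, so reports can arrive asynchronously while the actuators always read a consistent, current list.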

  Businesses must learn to harness real-time data as key signals that inform a far more efficient feedback loop for product development, customer service, and resource allocation.

  >>> in conclusion: the stuff that matters

  All of this is in many ways a preamble to what may be the most important part of the Web Squared opportunity.

  The new direction for the Web, its collision course with the physical world, opens enormous new possibilities for business, and enormous new possibilities to make a difference on the world’s most pressing problems.

  There are already hundreds of examples of this happening. But there are many other areas in which we need to see a lot more progress—from our energy ecosystem to our approach to health care. Not to mention our financial system, which is in disarray. Even in a pro-regulatory environment, the regulators in government are hopelessly outclassed by real-time automated financial systems. What have we learned from the consumer Internet that could become the basis for a new twenty-first-century financial regulatory system? We need machine learning to be applied here, algorithms to detect anomalies, transparency that allows auditing by anyone who cares, not just by overworked, understaffed regulators.
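  To make “algorithms to detect anomalies” concrete, here is a deliberately tiny sketch in Python. The data, the three-sigma threshold, and the function name are invented for the example; real market surveillance would operate over vastly richer features, but the shape of the idea is the same: establish a baseline, then flag what deviates from it.

    from statistics import mean, stdev

    def is_anomalous(baseline, value, threshold=3.0):
        """Flag a reading more than `threshold` sigmas from the baseline mean."""
        mu, sigma = mean(baseline), stdev(baseline)
        return sigma > 0 and abs(value - mu) / sigma > threshold

    # Hourly trade volumes for some instrument, then two new readings.
    history = [102, 98, 105, 97, 101, 99, 103, 100, 104]
    print(is_anomalous(history, 103))   # False: ordinary fluctuation
    print(is_anomalous(history, 940))   # True: a spike worth auditing

  The transparency argument is the interesting part: if the baseline data and the detection rule are published, anyone, not just the regulator, can rerun the check.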

  When we started the Web 2.0 events, we stated that “the Web is a platform.” Since then, thousands of businesses and millions of lives have been changed by the products and services built on that platform.

  But 2009 marks a pivot point in the history of the Web. It’s time to leverage the true power of the platform we’ve built. The Web is no longer an industry unto itself—the Web is now the world.

  And the world needs our help.

  If we are going to solve the world’s most pressing problems, we must put the power of the Web to work—its technologies, its business models, and perhaps most important, its philosophies of openness, collective intelligence, and transparency. And to do that, we must take the Web to another level. We can’t afford incremental evolution anymore.

  It’s time for the Web to engage the real world. Web meets World—that’s Web Squared.

 

  web 2.0: the second generation of the internet has arrived and it’s worse than you think

  Originally published in The Weekly Standard (February 14, 2006).

  Writer and entrepreneur ANDREW KEEN is the author of The Cult of the Amateur: How the Internet Is Killing Our Culture (2007). Keen founded Audiocafe.com in 1995 and is currently the host of the “Keen On” show on Techcrunch.tv. His new book about the social media revolution, Digital Vertigo: An Anti-Social Manifesto, will be published by St. Martin’s Press in 2012. You can follow him on Twitter at twitter.com/ajkeen.

  THE ANCIENTS were good at resisting seduction. Odysseus fought the seductive song of the Sirens by having his men tie him to the mast of his ship as it sailed past the Sirens’ Isle. Socrates was so intent on protecting citizens from the seductive opinions of artists and writers that he banished them from his imaginary republic.

  We moderns are less nimble at resisting great seductions, particularly those utopian visions that promise grand political or cultural salvation. From the French and Russian revolutions to the countercultural upheavals of the ’60s and the digital revolution of the ’90s, we have been seduced, time after time and text after text, by the vision of a political or economic utopia.

  Rather than Paris, Moscow, or Berkeley, the grand utopian movement of our contemporary age is headquartered in Silicon Valley, whose great seduction is actually a fusion of two historical movements: the countercultural utopianism of the ’60s and the techno-economic utopianism of the ’90s. Here in Silicon Valley, this seduction has announced itself to the world as the “Web 2.0” movement.

  Last week, I was treated to lunch at a fashionable Japanese restaurant in Palo Alto by a serial Silicon Valley entrepreneur who, back in the dot-com boom, had invested in my start-up Audiocafe.com. The entrepreneur, like me, a Silicon Valley veteran, was pitching me his latest start-up: a technology platform that creates easy-to-use software tools for online communities to publish weblogs, digital movies, and music. It is technology that enables anyone with a computer to become an author, a film director, or a musician. This Web 2.0 dream is Socrates’ nightmare: technology that arms every citizen with the means to be an opinionated artist or writer.

  “This is historic,” my friend promised me. “We are enabling Internet users to author their own content. Think of it as empowering citizen media. We can help smash the elitism of the Hollywood studios and the big record labels. Our technology platform will radically democratize culture, build authentic community, create citizen media.” Welcome to Web 2.0.

  Buzzwords from the old dot-com era—like “cool,” “eyeballs,” or “burn-rate”—have been replaced in Web 2.0 by language which is simultaneously more militant and absurd: Empowering citizen media, radically democratize, smash elitism, content redistribution, authentic community.... This sociological jargon, once the preserve of the hippie counterculture, has now become the lexicon of new media capitalism.

  Yet this entrepreneur owns a $4 million house a few blocks from Steve Jobs’s house. He vacations in the South Pacific. His children attend the most exclusive private academy on the peninsula. But for all of this he sounds more like a cultural Marxist—a disciple of Gramsci or Herbert Marcuse—than a capitalist with an MBA from Stanford.

  In his mind, “big media”—the Hollywood studios, the major record labels and international publishing houses—really did represent the enemy. The promised land was user-generated online content. In Marxist terms, the traditional media had become the exploitative “bourgeoisie,” and citizen media, those heroic bloggers and podcasters, were the “proletariat.”

  This outlook is typical of the Web 2.0 movement, which fuses ’60s radicalism with the utopian eschatology of digital technology. The ideological outcome may be trouble for all of us.

  So what, exactly, is the Web 2.0 movement? As an ideology, it is based upon a series of ethical assumptions about media, culture, and technology. It worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone—even the most poorly educated and inarticulate amongst us—can and should use digital media to express and realize themselves. Web 2.0 “empowers” our creativity, it “democratizes” media, it “levels the playing field” between experts and amateurs. The enemy of Web 2.0 is “elitist” traditional media.

  Empowered by Web 2.0 technology, we can all become citizen journalists, citizen videographers, citizen musicians. Empowered by this technology, we will be able to write in the morning, direct movies in the afternoon, and make music in the evening.

  Sound familiar? It’s eerily similar to Marx’s seductive promise about individual self-realization in his German Ideology:

  Whereas in communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic.

  Just as Marx seduced a generation of European idealists with his fantasy of self-realization in a communist utopia, so the Web 2.0 cult of creative self-realization has seduced everyone in Silicon Valley. The movement bridges countercultural radicals of the ’60s such as Steve Jobs with the contemporary geek culture of Google’s Larry Page. Between the bookends of Jobs and Page lies the rest of Silicon Valley, including radical communitarians like Craig Newmark (of Craigslist.com), intellectual property communists such as Stanford Law professor Larry Lessig, economic cornucopians like Wired magazine editor Chris “Long Tail” Anderson, and new media moguls Tim O’Reilly and John Battelle.

  The ideology of the Web 2.0 movement was perfectly summarized at the Technology, Entertainment and Design (TED) show in Monterey last year, when Kevin Kelly, Silicon Valley’s über-idealist and author of the Web 1.0 Internet utopia New Rules for the New Economy, said:

  Imagine Mozart before the technology of the piano. Imagine Van Gogh before the technology of affordable oil paints. Imagine Hitchcock before the technology of film. We have a moral obligation to develop technology.

  But where Kelly sees a moral obligation to develop technology, we should actually have—if we really care about Mozart, Van Gogh, and Hitchcock—a moral obligation to question the development of technology.

  The consequences of Web 2.0 are inherently dangerous for the vitality of culture and the arts. Its empowering promises play upon that legacy of the ’60s—the creeping narcissism that Christopher Lasch described so presciently, with its obsessive focus on the realization of the self.

  Another word for narcissism is “personalization.” Web 2.0 technology personalizes culture so that it reflects ourselves rather than the world around us. Blogs personalize media content so that all we read are our own thoughts. Online stores personalize our preferences, thus feeding back to us our own taste. Google personalizes searches so that all we see are advertisements for products and services we already use.

  Instead of Mozart, Van Gogh, or Hitchcock, all we get with the Web 2.0 revolution is more of ourselves.

  Still, the idea of inevitable technological progress has become so seductive that it has been transformed into “laws.” In Silicon Valley, the most quoted of these laws, Moore’s Law, states that the number of transistors on a chip doubles every two years, thus doubling the memory capacity of the personal computer every two years. On one level, of course, Moore’s Law is real and it has driven the Silicon Valley economy. But there is an unspoken ethical dimension to Moore’s Law. It presumes that each advance in technology is accompanied by an equivalent improvement in the condition of man.

  But as Max Weber so convincingly demonstrated, the only really reliable law of history is the Law of Unintended Consequences.

  We know what happened the first time around, in the dot-com boom of the ’90s. At first there was irrational exuberance. Then the dot-com bubble popped; some people lost a lot of money, and a lot of people lost some money. But nothing really changed. Big media remained big media, and almost everything else—with the exception of Amazon.com and eBay—withered away.

  This time, however, the consequences of the digital media revolution are much more profound. Apple and Google and craigslist really are revolutionizing our cultural habits, our ways of entertaining ourselves, our ways of defining who we are. Traditional “elitist” media is being destroyed by digital technologies. Newspapers are in free fall. Network television, the modern equivalent of the dinosaur, is being shaken by TiVo’s overnight annihilation of the thirty-second commercial. The iPod is undermining the multibillion-dollar music industry. Meanwhile, digital piracy, enabled by Silicon Valley hardware and justified by Silicon Valley intellectual property communists such as Larry Lessig, is draining revenue from established artists, movie studios, newspapers, record labels, and songwriters.

  Is this a bad thing? The purpose of our media and culture industries—beyond the obvious need to make money and entertain people—is to discover, nurture, and reward elite talent. Our traditional mainstream media has done this with great success over the last century. Consider Alfred Hitchcock’s masterpiece Vertigo, and a couple of other brilliant works of the same name: the 1999 book Vertigo, by the Anglo-German writer W. G. Sebald, and the 2004 song “Vertigo,” by the Irish rock star Bono. Hitchcock could never have made his expensive, complex movies outside the Hollywood studio system. Bono would never have become Bono without the music industry’s super-heavyweight marketing muscle. And W. G. Sebald, the most obscure of this trinity of talent, would have remained an unknown university professor had a high-end publishing house not had the good taste to discover and distribute his work. Elite artists and an elite media industry are symbiotic. If you democratize media, then you end up democratizing talent. The unintended consequence of all this democratization, to misquote Web 2.0 apologist Thomas Friedman, is cultural “flattening.” No more Hitchcocks, Bonos, or Sebalds. Just the flat noise of opinion—Socrates’ nightmare.

  While Socrates rightly warned about the dangers of a society infatuated with opinion in Plato’s Republic, more modern dystopian writers—Huxley, Bradbury, and Orwell—got the Web 2.0 future exactly wrong. Much has been made, for example, of the associations between the all-seeing, all-knowing qualities of Google’s search engine and Big Brother in 1984. But Orwell’s fear was the disappearance of the individual right to self-expression. Thus Winston Smith’s great act of rebellion in 1984 was his decision to pick up a rusty pen and express his own thoughts:

  The thing that he was about to do was open a diary. This was not illegal, but if detected it was reasonably certain that it would be punished by death . . . Winston fitted a nib into the penholder and sucked it to get the grease off . . . He dipped the pen into the ink and then faltered for just a second. A tremor had gone through his bowels. To mark the paper was the decisive act.

  In the Web 2.0 world, however, the nightmare is not the scarcity, but the overabundance of authors. Since everyone will use digital media to express themselves, the only decisive act will be to not mark the paper. Not writing as rebellion sounds bizarre—like a piece of fiction authored by Franz Kafka. But one of the unintended consequences of the Web 2.0 future may well be that everyone is an author, while there is no longer any audience.

  Speaking of Kafka, on the back cover of the January 2006 issue of Poets and Writers magazine, there is a seductive Web 2.0-style advertisement that reads:

  Kafka toiled in obscurity and died penniless. If only he’d had a website . . .

  Presumably, if Kafka had had a website, it would be located at kafka.com, which is today an address owned by a mad left-wing blog called The Biscuit Report. The front page of this site quotes some words written by Kafka in his diary:

  I have no memory for things I have learned, nor things I have read, nor things experienced or heard, neither for people nor events; I feel that I have experienced nothing, learned nothing, that I actually know less than the average schoolboy, and that what I do know is superficial, and that every second question is beyond me. I am incapable of thinking deliberately; my thoughts run into a wall. I can grasp the essence of things in isolation, but I am quite incapable of coherent, unbroken thinking. I can’t even tell a story properly; in fact, I can scarcely talk. . . .

  One of the unintended consequences of the Web 2.0 movement may well be that we fall, collectively, into the amnesia that Kafka describes. Without an elite mainstream media, we will lose our memory for things learned, read, experienced, or heard. The cultural consequences of this are dire, requiring the authoritative voice of at least an Allan Bloom, if not an Oswald Spengler. But here in Silicon Valley, on the brink of the Web 2.0 epoch, there no longer are any Blooms or Spenglers. All we have is the great seduction of citizen media, democratized content, and authentic online communities. And weblogs, of course. Millions and millions of blogs.

 

  wikipedia and beyond: jimmy wales’s sprawling vision

  Originally published in Reason magazine (June 2007).

  KATHERINE MANGU-WARD is a senior editor at Reason magazine. Previously, Mangu-Ward worked as a reporter for The Weekly Standard magazine and as a researcher on The New York Times op-ed page. Her work has appeared in The Wall Street Journal, Washington Post, The Los Angeles Times, The New York Times online, and numerous other publications. She blogs at reason.com.

  JIMMY WALES, the founder of Wikipedia, lives in a house fit for a grandmother. The progenitor and public face of one of the ten most popular websites in the world beds down in a one-story bungalow on a cul-de-sac near St. Petersburg, Florida. The neighborhood, with its scrubby vegetation and plastic lawn furniture, screams “Bingo Night.” Inside the house, the décor is minimal, and the stucco and cool tile floors make the place echo. A few potted plants bravely attempt domesticity. Out front sits a cherry red Hyundai.

  I arrive at Wales’s house on a gray, humid day in December. It’s 11 a.m., and after wrapping up some e-mails on his white Mac iBook, Wales proposes lunch. We hit the mean streets of Gulf Coast Florida in the Hyundai, in search of “this really great Indian place that’s part of a motel,” and wind up cruising for hours—stopping at Starbucks, hitting the mall, and generally duplicating the average day of millions of suburban teenagers. Walmarts and Olive Gardens slip past as Wales, often taciturn and abrupt in public statements, lets loose a flood of words about his past, his politics, the future of the Internet, and why he’s optimistic about pretty much everything.

  Despite his modest digs, Wales is an Internet rock star. He was included on Time’s list of the 100 most influential people of 2006. Pages from Wikipedia dominate Google search results, making the operation, which dubs itself “the free encyclopedia that anyone can edit,” a primary source of information for millions of people. (Do a Google search for “monkeys,” “Azerbaijan,” “mass spectrometry,” or “Jesus,” and the first hit will be from Wikipedia.) Although he insists he isn’t a “rich guy” and doesn’t have “rich guy hobbies,” when pressed Wales admits to hobnobbing with other geek elites, such as Amazon founder Jeff Bezos, and hanging out on Virgin CEO Richard Branson’s private island. (The only available estimate of Wales’s net worth comes from a now-removed section of his own Wikipedia entry, pinning his fortune at less than $1 million.) Scruffy in a gray mock turtleneck and a closely cropped beard, the forty-year-old Wales plays it low-key. But he is well aware that he is a strangely powerful man. He has utterly changed the way people extract information from the chaos of the World Wide Web, and he is the master of a huge, robust online community of writers, editors, and users. Asked about the secret to Wikipedia’s success, Wales says simply, “We make the Internet not suck.”

 
