Smart Mobs

by Howard Rheingold


  The conversations on Usenet include valuable information in individual postings far beyond the highly distilled information published in FAQs. The problem with Usenet conversations as a repository of knowledge is the difficulty of finding the useful tidbits among the tidal waves of chit-chat and endless flame wars. In 1992 Paul Resnick and his colleagues at the University of Michigan created GroupLens software, which allowed readers to rate Usenet messages and make ratings available to others on request.7 Resnick, along with many others, has continued to conduct research on reputation systems during the decade since GroupLens.8 GroupLens continues to provide a free movie recommendation service on the Web.9

  Automated collaborative filtering systems work best when the risk of making a bad decision is low, as it is when buying a book or a movie ticket. Amazon.com and other e-commerce sites use collaborative filtering systems to make suggestions to regular customers. Systems that can tell people what they want to buy next are bound to coevolve along with e-commerce. When the risk increases and choices involve larger amounts of money, what happens to trust? eBay’s reputation system answers this question with remarkable success. When the currency of a social filtering system changes from knowledge or social recognition to money, the evolution of this social technology forks into lineages of reputation systems that deal with markets and those that deal with recommendations. Later in this chapter I’ll return to the role of reputation systems in markets. Both the systems that deal with knowledge and recommendations and those that mediate transactions in financial markets could come into play when future populations of wearable computer users form ad hoc networks around what they know and trust about each other.
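  In code, the core of such a recommender can be sketched in a few lines. The following is a minimal illustration of user-to-user collaborative filtering, not the algorithm of GroupLens, Amazon, or any other particular system; the user names and ratings are invented, and real services add many refinements.

```python
# Minimal user-based collaborative filtering sketch (hypothetical data).
from math import sqrt

ratings = {
    "alice": {"film_a": 5, "film_b": 3, "film_c": 4},
    "bob":   {"film_a": 4, "film_b": 2, "film_d": 5},
    "carol": {"film_b": 5, "film_c": 1, "film_d": 2},
}

def similarity(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in shared)
    norm_u = sqrt(sum(ratings[u][i] ** 2 for i in shared))
    norm_v = sqrt(sum(ratings[v][i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Score items the user has not rated, weighted by rater similarity."""
    scores, weights = {}, {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for item, r in ratings[other].items():
            if item in ratings[user]:
                continue
            scores[item] = scores.get(item, 0.0) + sim * r
            weights[item] = weights.get(item, 0.0) + sim
    ranked = ((item, scores[item] / weights[item])
              for item in scores if weights[item])
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

print(recommend("alice"))  # e.g. [('film_d', ...)]
```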

  Any way that groups can share knowledge efficiently in the course of online conversation, without doing anything beyond having the conversation, could generate real power in scientific and business communities. Even simple instruments that enable groups to share knowledge online by recommending useful Web sites, without requiring any action by the participants beyond bookmarking them, can multiply the groups’ effectiveness.

  In 1997, Hui Guo, Thomas Kreifelts, and Angi Voss of the German National Research Center for Information Technology described their “SOaP” social filtering service designed to address several of the problems constraining recommender systems.10 Guo and his colleagues created software agents, programs that could search, query, gather information, report results, and even negotiate and execute transactions with other programs. The SOaP agents could implicitly collect recommendation information from the members of a group and mediate among people, groups, and the Web. At the most implicit level, SOaP agents can collect and cluster URLs that members of the group bookmark in the course of their work. If a member of the group is interested enough to bookmark a site, an implicit recommendation has been made. People who wish to make more explicit annotations are able to do so:

  In our system, information is filtered by communicative social agents which collect human users’ assessments and match users’ interests to derive recommendations. With agents, it is possible for users to find relevant bookmarks regarding specific topics, find people with similar interests, find groups with similar topics, and also to form groups for direct cooperation.

  In order to perform the services, agents in our system use knowledge about users, groups of users, the topics that are relevant to a user, the URLs that a user considers relevant to a topic, and a user’s assessments of a URL, e.g., his or her ratings and annotations, in the context of a particular group or in connection with a particular topic. According to our design principle, this knowledge should be obtained without effort on the part of the user, or else it should be optional.11

  The key to aggregating knowledge through a group’s use of SOaP is social, not technical: The degree of trust among members is determined by each group that uses it. This allows for a variety of communities and degrees of trust. Huge populations of strangers known to one another only by their postings—Usenet, for example—can exchange a rich variety of low-trust recommendations. A smaller invitational network of experts, a self-organized community of interest, or an organizational unit can use the same system to increase their individual and collective knowledge. Individuals are given the power to determine whether their annotations and surfing habits are published, and to whom, but at a minimum all they need to do is surf the Web and bookmark sites that interest them.
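  The implicit-recommendation idea at the heart of SOaP can be illustrated with a small sketch: every bookmark a member saves counts as a vote for that URL within a topic, and the URLs saved by the most members float to the top. The data and function below are hypothetical illustrations, not the SOaP agents themselves.

```python
# Implicit group recommendations from bookmarks (hypothetical data).
from collections import Counter

# member -> topic -> set of bookmarked URLs
bookmarks = {
    "ana":  {"agents": {"http://example.org/soap", "http://example.org/kqml"}},
    "ben":  {"agents": {"http://example.org/soap"},
             "filtering": {"http://example.org/grouplens"}},
    "chen": {"filtering": {"http://example.org/grouplens",
                           "http://example.org/alexa"}},
}

def implicit_recommendations(topic, min_votes=1):
    """Rank a topic's URLs by how many group members bookmarked them."""
    votes = Counter()
    for topics in bookmarks.values():
        for url in topics.get(topic, ()):
            votes[url] += 1
    return [(url, n) for url, n in votes.most_common() if n >= min_votes]

print(implicit_recommendations("filtering"))
```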

  Brewster Kahle and Bruce Gilliat created a Web surfers’ collaborative filtering system, Alexa Internet, in 1996.12 Alexa is an implicit filtering system: When a person using it visits a Web site, the person’s Web browser provides a menu of Web sites that have been visited by other surfers who have visited the same page. Alexa requires users to install additional software that records their choices as they navigate through the Web and adds data about their choices to the database. Alexa is an instance of a “cornucopia of the commons,” which provisions the resource it consumes; users contribute to the database in the act of using it. At the same time it harvests their decisions, the installed software also offers users recommendations of other sites to investigate. Technology in the Alexa lineage continues to be available as a “related sites” feature for common browsers. Amazon acquired the company in 1999.
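  The co-visitation logic behind a “related sites” feature of this kind can be sketched simply: count which sites tend to appear in the same surfing sessions, then suggest the most frequent companions of the page being viewed. The sessions below are invented, and Alexa’s actual methods are certainly more elaborate.

```python
# "People who visited this page also visited" via co-occurrence counts
# over browsing sessions (hypothetical session data).
from collections import Counter, defaultdict
from itertools import combinations

sessions = [
    ["news.example", "weather.example", "sports.example"],
    ["news.example", "weather.example"],
    ["sports.example", "scores.example", "news.example"],
]

co_visits = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(set(session), 2):
        co_visits[a][b] += 1
        co_visits[b][a] += 1

def related_sites(site, top_n=3):
    """Sites most often visited in the same sessions as `site`."""
    return [s for s, _ in co_visits[site].most_common(top_n)]

print(related_sites("news.example"))
```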

  Ego Strokes, Opinion Markets, and Bozofilters

  During the last months of the twentieth century, a few Internet startups burned millions of dollars attempting to profit from online knowledge-sharing communities. Many of these startups succeeded in attracting communities of experts who competed to make the enterprise more valuable by contributing their knowledge. The volunteer mavens on everything from hummingbirds to Sumerian antiquities took their payment in pennies and prestige. The opposite of the free-rider problem emerged in a number of forms—hordes of compulsive contributors.

  The decades-old friendly competitions to provide answers online became a commercial enterprise toward the end of the dotcom era when expert opinion, advice, and recommendation Web sites such as Epinions, Askme.com, Experts-Exchange, Allexperts.com, ExpertCentral.com, and Abuzz.com launched. Most of these enterprises failed when Web-based advertising revenues sank, but many were markedly successful in producing high-quality evaluations about everything from computer programming problems to comic book collections. People with expertise contributed answers, tidbits, essays, pages of software code, lore of astonishing variety. A few contributors earned the kind of currency banks accept. Most contributed for the social recognition that came with being a top-ranked reviewer. The “reputation managers” that enabled users and other recommenders to rate each other made possible opinion markets that traded almost entirely in ego gratification.

  Epinions, launched in September 1999, continued to thrive in 2002, unlike many of its competitors, and it continued to host an active community of evaluation providers.13 “Epinionators” are paid between one and three cents every time another registered member reads and rates one of their 100 to 1,000 word reviews. Well-rated reviews rise to the top of their category’s list. A very few contributors even make a living at it, yet hundreds continue to provide evaluations of thousands of products and services. If you can use it and pay for it, you can find an Epinion about it. Members can rate each review as “Highly Recommended,” “Recommended,” “Somewhat Recommended,” or “Not Recommended.” Members can click a button next to the name of an Epinionator and add him or her to a personal “web of trust.” People who trust each other inherit each other’s webs of trust. Although webs of trust are an official feature of Epinions, the first web of mistrust appeared spontaneously, created by a user.
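  The inheritance of webs of trust can be pictured as a walk over a graph of who trusts whom. The sketch below is only an illustration of that idea, with invented members and an arbitrary depth limit; it is not Epinions’ actual mechanism.

```python
# Breadth-first walk over a hypothetical trust graph: if I trust you,
# I also see the reviewers you trust, up to a depth limit.
from collections import deque

trusts = {
    "dana": {"eli"},
    "eli":  {"fay", "gus"},
    "fay":  {"dana"},
    "gus":  set(),
}

def web_of_trust(member, max_depth=2):
    """Members reachable through chains of trust up to max_depth hops."""
    seen, queue = set(), deque([(member, 0)])
    while queue:
        current, depth = queue.popleft()
        if depth == max_depth:
            continue
        for trusted in trusts.get(current, ()):
            if trusted not in seen and trusted != member:
                seen.add(trusted)
                queue.append((trusted, depth + 1))
    return seen

print(web_of_trust("dana"))  # {'eli', 'fay', 'gus'}
```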

  Epinions continuously publishes updated ratings for the community to see. This feature is mentioned by some habitual users who joke about their prolific contributions as a compulsion: “I am addicted to a drug called Epinions. I have to keep going back for more,” one of the top-rated Epinionators confessed on a message board.14 Instant social approval can be intoxicating. Science fiction fans of the 1920s who published amateur “fanzines” invented a word for this kind of personal gratification: “egoboo,” derived from “ego boost,” is the coin of online social prestige in expert opinion communities.15

  Above all, Epinions is a social network. “Epinions is one of the most active and varied ecosystems on the Web,” Wired editor Mark Frauenfelder wrote. “It has evolved into a diverse community populated by cliques, clowns, parasites, symbiotes, self-appointed cops, cheaters, flamers, and feuders. It’s swarming with people who were English or journalism majors but ended up stuck in other careers. And it has produced member-generated site refinements, such as the Web of Distrust.”16

  The methodology of self-organizing knowledge communities lives on in many forms, although the recommender community industry is a tale of yesteryear. Most self-organizing Web sites now are partially or totally noncommercial. Some self-organizing sites subscribe to the open-source philosophy. Other sites are purely amateur but incorporate quality control. A 2001 story in the New York Times spotlighted “The Vines Network,” an “Encyclopedia of Everything” written by volunteers for free, in which readers rate content on a 1 to 10 point scale.17 Collectively written encyclopedias mushroomed into a genre of their own, including virtual subcultures. Another self-organized encyclopedia, “everything2.com,” drew a rich community of regulars organized around a rating system and message boards.18 At everything2, the contributors of knowledge grow an encyclopedia around conversations about sports, politics, classical history, chihuahuas, or quantum physics.

  Another new online subculture has grown out of the self-published Web-surfing diaries known as “blogs,” short for “weblogs.”19 Blogging software made it easy for anyone to update a simple Web site frequently. Wired News estimated that 500,000 blogs had been created by February 2002.20 Blogs, sorted by type of content, come in thousands of flavors, but almost all are updated regularly, include links to favorite sites, concentrate on a theme or interest, and include commentary on the mentioned sites. Sometimes blogs are like diaries. Others are like fanzines or indexes to subcultures. Almost every blog includes a list of related or favorite blogs and “discuss” links that enable communities to form. Rings of blogs about similar interests self-organize. Communities of shared affinity emerge spontaneously from discussions. MIT Professor Henry Jenkins describes the power blogging has to reframe issues:

  Imagine a world where there are two kinds of media power: one comes through media concentration, where any message gains authority simply by being broadcast on network television; the other comes through grass-roots intermediaries, where a message gains visibility only if it is deemed relevant to a loose network of diverse publics. Broadcasting will place issues on the national agenda and define core values; bloggers will reframe those issues for different publics and ensure that everyone has a chance to be heard.21

  The liberating news about virtual communities is that you don’t have to be a professional writer, artist, or television journalist in order to express yourself to others. Everyone can be a publisher or a broadcaster now. Many-to-many communications media have proved to be popular and democratic. Evidence: the history of Usenet. The disappointing news about virtual communities is that you don’t have to be civil, capable of communicating coherently, or know what you are talking about in order to express yourself to others. Evidence: the history of Usenet. Some people proclaim opinions that are so abhorrent or boring, use such foul language, or are such poor communicators that they sour discussions that would otherwise be valuable to the majority of participants. Some people have a voracious need for attention and don’t care whether it is negative attention. Other people use the shield of anonymity to unleash their aggressions, bigotry, and sadistic impulses.

  The presence of flamers, bullies, bigots, charlatans, know-nothings, and nuts in online discourse poses a classic tragedy of the commons dilemma. If too many people take advantage of open access to seek other people’s attention, the excesses of the free riders drive away the people who make the conversation valuable. Online media that support social communication have a defensive capability that face-to-face socializing lacks: It is possible for civilized conversationalists to tune out those who abuse the conversational commons.

  Rules about behavior would have been unthinkable in Usenet’s anarchic culture. The programmers who populated the early Usenet proclaimed allegiance to “tools, not rules.” In the case of conversational noise, Usenet devotees adopted a program known as the “killfile,” which enabled people to eliminate from their view (but not that of others) specific words, postings of particular people, or even certain topics of conversation. The words, postings, and topics themselves are unaffected, but killfile users render them invisible to themselves. On the Well, and in other online communities, an equivalent program came to be known as the “bozofilter.”
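  A killfile or bozofilter is conceptually simple: a personal list of authors, topics, and words whose matching posts are hidden from the reader’s view and no one else’s. A toy version, with invented posts and field names, might look like this:

```python
# A toy killfile: posts are never deleted, only hidden from the reader
# who configured the filter. Sample posts and filter entries are invented.
KILL_AUTHORS = {"flamer42"}
KILL_TOPICS = {"alt.flame"}
KILL_WORDS = {"FLAME WAR"}

posts = [
    {"author": "helpful_person", "topic": "comp.lang.c", "body": "Try fgets()."},
    {"author": "flamer42", "topic": "comp.lang.c", "body": "You are all idiots."},
    {"author": "someone", "topic": "alt.flame", "body": "FLAME WAR thread 912"},
]

def visible(post):
    """True unless the post matches the reader's personal killfile."""
    if post["author"] in KILL_AUTHORS or post["topic"] in KILL_TOPICS:
        return False
    return not any(word in post["body"] for word in KILL_WORDS)

for post in filter(visible, posts):
    print(post["author"], "|", post["body"])
```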

  Hiding crap is the easy part. The real achievement is finding quality. The notion that people can use software to filter conversations according to individual preferences has continued to evolve, and other instruments have begun to emerge for enabling both democratic access and tuning out semantic noise. The open source community in particular has been a fertile ground for social reputation systems. Tens of thousands of freelance programmers who work collectively to create and share open source software and who collaborate on paying projects tend to live online. They have a strong ethic of sharing and a strong aversion to censorship. In 1998, an open source programmer, Rob Malda, started a small discussion forum to talk about programming, issues in the news, and pop culture: “News for Nerds, Stuff That Matters,” proclaimed the front page. Official editors would select relevant stories every day, post links and commentary, and the Slashdot community around the world added commentary in the form of sequential posts. Malda, who goes by the online handle “Commander Taco,” called it “Slashdot,” after a commonly used Linux command. Malda later wrote: “We got dozens of posts each day, and it was good. The signal was high, the noise was low.”22

  The time was right for a virtual watering hole to appear as a hangout for the programmers around the world who shared the open source zeitgeist. The Slashdot population grew, and soon there were too many posts to police and too much noise to ignore. Malda chose twenty-five people to help. They deleted spam and awarded points to posts that seemed valuable. Then the Slashdot population grew unmanageable even for twenty-five volunteers. By 1999, if a link to a Web site was posted as a top-level story on Slashdot, that Web site would get so many hits that host servers often crashed, a phenomenon that came to be known Netwide as “being Slash-dotted.” The original twenty-five moderators chose four hundred more. The Slashdot karma system emerged to filter out noise, point out good postings, and protect against abuse of power from moderators.

  When a registered user logs in often enough and reads postings over a sustained period, Slashdot’s “Slashcode” software automatically puts that user in a pool of candidates for jury-like service. Randomly selected “moderators” from the pool of regulars are given a limited number of points to use in rating posts of other members, and when they expend those points, their term of service is over until they are selected again. Moderators can use their points to rate postings on a scale of -1 to +5, and they can attach annotations such as “flamebait” or “informative.” Posters can choose to remain anonymous, in which case they start with a karma setting of zero and their posts are labeled “Anonymous Coward.” Registered posters, who can use pseudonyms, start out with a karma setting of +1. Moderators use their allotment of points to raise or lower the settings of selected posts and hence affect the karma of the selected posters.

  Slashdot readers can use a menu to set their “quality filter” reading level. Some readers choose to read every one of hundreds of posts in a particular discussion; others set their quality filter to show only those with a rating of 3 or above, usually reducing the number of posts to dozens, or only those with the highest rating of +5, sometimes reducing a thread of hundreds of posts to a handful.
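  The mechanics described above, registered posts starting at +1, anonymous ones at zero, moderators spending a limited allotment of points within the -1 to +5 range, and readers filtering at a chosen threshold, can be captured in a short sketch. This is an illustration of the idea, not Slashcode itself:

```python
# Sketch of Slashdot-style moderation and reader-side threshold filtering.
MIN_SCORE, MAX_SCORE = -1, 5

class Post:
    def __init__(self, body, anonymous=False):
        self.body = body
        self.score = 0 if anonymous else 1  # anonymous posts start lower

class Moderator:
    def __init__(self, points=5):
        self.points = points  # limited allotment; term ends when spent

    def moderate(self, post, up=True):
        if self.points <= 0:
            return
        delta = 1 if up else -1
        post.score = max(MIN_SCORE, min(MAX_SCORE, post.score + delta))
        self.points -= 1

def visible_posts(posts, threshold):
    """What a reader sees at a given quality-filter setting."""
    return [p for p in posts if p.score >= threshold]

thread = [Post("Insightful analysis"), Post("FIRST POST!!!", anonymous=True)]
mod = Moderator()
mod.moderate(thread[0], up=True)   # score rises to +2
mod.moderate(thread[1], up=False)  # score drops to -1
print([p.body for p in visible_posts(thread, threshold=2)])
```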

  By 2001, the Slashdot community of registered users exceeded 300,000. At that scale, there was no way to organize except self-organize. Malda and friends tinkered with the reputation system in response to community use and abuse, adhering to four design goals:

  Promote quality, discourage crap.

  Make Slashdot as readable as possible for as many people as possible.

  Do not require a huge amount of time from any single moderator.

  Do not allow a single moderator a “reign of terror.”23

  The Slashdot system evolved several refinements. Moderators cannot post in the same conversations they moderate, and metamoderators are randomly chosen to assign points to the moderators’ choices, providing a shield against moderator abuse. Because Slashcode is open source, other groups copied it, set up their own discussions, and began to modify the code to create their own variations.

  Reputation is even more important in commerce than it is in conversation. Without some kind of trust metric, e-commerce never would have become possible. Although the number of e-businesses has been reduced from thousands to a smaller number of larger enterprises, eBay, the most successful electronic marketplace, combined e-commerce, online affinity groups, and reputation management.

  Restoring the Shadow of the Future

  In 1995, Pierre Omidyar created eBay so that his wife could trade Pez dispensers—a form of packaging for candy now valued by collectors. The Omidyars are billionaires now—from creating an electronic marketplace, not from trading Pez dispensers. In 2000, eBay users transacted more than $5 billion in gross merchandise sales. By 2002, eBay had more than 42 million registered users and was the most popular shopping site on the Internet.24 Millions of items are listed for sale on any given day in thousands of categories. eBay offers no warranty for its auctions; it merely puts buyers and sellers together, gives them a place to display pictures of their wares, automatically manages auctions, provides a reputation management system, and takes a small listing fee.

 
