The Digital Divide


by Mark Bauerlein


  Popular culture also helped to bridge the awkward silences in my exchanges with Sarah’s parents. I had wondered what a media scholar from “the People’s Republic of Cambridge” would say to two retired Air Force officers from Nebraska. As Sarah’s mother and I sat in the arcade, trying to dodge religion and politics, we found common ground discussing Star Trek, the original Saturday Night Live cast, and of course, Mutual of Omaha’s Wild Kingdom.

  Henry and Sarah broke up sometime after that trip—not because they had met online or because the real-life experience hadn’t lived up to their expectations but because they were fifteen, their interests shifted, and they never really overcame her father’s opposition. Henry’s next relationship was also online—with a girl from Melbourne, Australia, and that experience broadened his perspective on the world, at the price of much sleep as they negotiated time differences. Now twenty-one, he has gone through his normal share of other romantic entanglements, some online, more face-to-face (with many of the latter conducted, at least in part, online to endure the summer vacation separation).

  We’ve read more than a decade of press coverage about online relationships—much of it written since my son and I made this trip together. Journalists love to talk about the aberrant qualities of virtual sex. Yet, many of us embraced the Internet because it has fit into the most personal and banal spaces of our lives. Focusing on the revolutionary aspects of online courtship blinds us to the continuities in courtship rituals across generations and across media. Indeed, the power of physical artifacts (the imprint of lips on paper, the faded petals of a rose), of photographs, of the voice on the telephone, gain new poignancy in the context of these new relationships. Moreover, focusing on the online aspects of these relationships blinds us to the agility with which teens move back and forth across media. Their daily lives require constant decisions about what to say on the phone, what to write by hand, what to communicate in chat rooms, what to send by e-mail. They juggle multiple identities—the fictional personas of electronic wrestling, the constructed ideals of romantic love, and the realities of real bodies and real emotions.

  < Cathy Davidson >

  we can’t ignore the influence of digital technologies

  Originally published in The Chronicle of Higher Education (March 19, 2007).

  CATHY DAVIDSON is the John Hope Franklin Humanities Institute Professor of Interdisciplinary Studies and Ruth F. DeVarney Professor of English at Duke University. She is the codirector of the $2 million annual HASTAC/John D. and Catherine T. MacArthur Foundation Digital Media and Learning Competition. Her books include Revolution and the Word: The Rise of the Novel in America (1986) and The Future of Thinking: Learning Institutions in a Digital Age (with HASTAC cofounder David Theo Goldberg in 2010). She blogs on new media and learning at www.hastac.org as “Cat in the Stack.”

  WHEN I READ the other day that the history department at Middlebury College had “banned Wikipedia,” I immediately wrote to the college’s president, Ronald D. Liebowitz, to express my concern that such a decision would lead to a national trend, one that would not be good for higher education. “Banning” has connotations of evil or heresy. Is Wikipedia really that bad?

  I learned from Mr. Liebowitz that the news media had exaggerated the real story. The history department’s policy that students not cite Wikipedia in papers or examinations is consistent with an existing policy on not citing sources such as Encyclopaedia Britannica. It is hardly a “ban.” It is a definition of what constitutes credible scholarly or archival sources.

  Even granting that the news media exaggerated, it is useful to think about why this was a story at all and what we can learn from it. The coverage echoed the most Luddite reactions to Wikipedia and other ventures in creating knowledge in a collaborative, digital environment. In fact, soon after the Middlebury story was reported, one of my colleagues harrumphed, “Thank goodness someone is maintaining standards!” I asked what he meant, and he said that Wikipedia was prone to error. So are encyclopedias, I countered. So are refereed scholarly books. (Gasp!) He was surprised when I noted that several comparative studies have shown that errors in Wikipedia are no more frequent than those in comparable print sources. More to the point, in Wikipedia, errors can be corrected. The specific one cited by the Middlebury history department—an erroneous statement that Jesuits had supported a rebellion in seventeenth-century Japan—was amended in a matter of hours.

  That brings us to a second point. Wikipedia is not just an encyclopedia. It is a knowledge community, uniting anonymous readers all over the world who edit and correct grammar, style, interpretations, and facts. It is a community devoted to a common good—the life of the intellect. Isn’t that what we educators want to model for our students?

  Rather than banning Wikipedia, why not make studying what it does and does not do part of the research-and-methods portion of our courses? Instead of resorting to the “Delete” button for new forms of collaborative knowledge made possible by the Internet, why not make the practice of research in the digital age the object of study? That is already happening, of course, but we could do more. For example, some professors already ask students to pursue archival research for a paper and then to post their writing on a class wiki. It’s just another step to ask them to post their labors on Wikipedia, where they can learn to participate in a community of lifelong learners. That’s not as much a reach for students as it is for some of their professors.

  Most of the students who entered Middlebury last fall were born around 1988. They have grown up with new technology skills, new ways of finding information, and new modes of informal learning that are also intimately connected to their social lives. I recently spent time with a five-year-old who was consumed by Pokémon. His parents were alarmed by his obsession, although his father reluctantly admitted that, at the same age, he had known every dinosaur and could recite their names with the same passion that his son now has for the almost five hundred (and growing) Pokémon characters. I also was able to assure the parents that by mastering the game at the level he had, their son was actually mastering a nine-year-old’s reading vocabulary. He was also customizing his games with editing tools that I can only begin to manipulate, and doing so with creativity and remarkable manual dexterity. The students at Middlebury have grown up honing those skills. Don’t we want them to both mine the potential of such tools in their formal education and think critically about them? That would be far more productive than a knee-jerk “Delete.”

  I must admit I have an investment in this issue. A passionate one. I am on the advisory board of the John D. and Catherine T. MacArthur Foundation’s Digital Media and Learning initiative, a five-year, $50 million project started last year to study how digital technologies are changing all forms of learning, play, and social interaction. One focus of the initiative is research on ways that schools and colleges can be as lively and inspiring intellectually as are the Internet imaginations of our children. Grantees are working on such projects as learning games where young children create their own Frankensteins, then consider the ethics and science of their creations; other researchers are helping students develop a new civic awareness as they use three-dimensional virtual environments to create new worlds with new social rules. In the spirit of collaboration, the MacArthur program sponsors a blog, Spotlight, where visitors can interact with grantees (http://spotlight.macfound.org). In all the projects, the knowledge is shared, collaborative, cumulative. Like Wikipedia.

  I am also cofounder of a voluntary network of academics called HASTAC (http://www.hastac.org)—an unwieldy acronym that stands for Humanities, Arts, Science, and Technology Advanced Collaboratory, but everyone just calls it “haystack.” With my cofounder, David Theo Goldberg, I have recently posted the first draft of a paper, written for the MacArthur program, on “The Future of Learning Institutions in a Digital Age.” That paper is on a collaborative website (http://www.futureofthebook.org/HASTAC/learningreport/about) that allows anyone to edit it, make comments, and contribute examples of innovative work. The site is sponsored by the Institute for the Future of the Book, a group dedicated to investigating how intellectual discourse changes as it shifts from printed pages to networked screens. We are holding a series of public forums and, in the end, will synthesize responses and include, in a “Hall of Vision,” examples of the most inventive learning we have found in the country, learning that is collaborative and forward-looking. We will also include a “Hall of Shame,” for retrograde and unthinking reactions to new technologies. (I was delighted to learn that, despite media reports, Middlebury College won’t have to go there.)

  As a cultural historian and historian of technology, I find that I often go to Wikipedia for a quick and easy reference before heading into more scholarly depths. I’m often surprised at how sound and good a first source it is. Its problems have been well rehearsed in the media—to take a case that came recently to light, the way someone can create a persona as a scholar and contribute information under false pretenses. Some entries are bogged down in controversies, and some controversial figures (including scholars whose work I admire) are discussed in essays that are a mess of point and counterpoint. But I just looked up two well-known literary critics, Judith Butler and Fredric Jameson, on Wikipedia. Two months ago, when I first looked, the entries I found amounted to “idea assassinations” (if not outright character assassinations). But someone has been busy. The entries on both figures are much improved. I clicked on the editing history to see who had added what and why. I looked up a half hour later and realized I’d gotten lost in a trail of ideas about postmodernism and the Frankfurt School—when I had a deadline to meet. Isn’t that the fantasy of what the educated life is like?

  I also find that my book purchasing has probably increased threefold because of Wikipedia. I am often engaged by an entry, then I go to the discussion pages, and then I find myself caught up in debate among contributors. Pretty soon I am locating articles via Project Muse and 1-Click shopping for books on Amazon. Why not teach that way of using the resource to our students? Why rush to ban the single most impressive collaborative intellectual tool produced at least since the Oxford English Dictionary, which started when a nonacademic organization, the Philological Society, decided to enlist hundreds of volunteer readers to copy down unusual usages of so-called unregistered words?

  I urge readers to take the hubbub around Middlebury’s decision as an opportunity to engage students—and the country—in a substantive discussion of how we learn today, of how we make arguments from evidence, of how we extrapolate from discrete facts to theories and interpretations, and on what basis. Knowledge isn’t just information, and it isn’t just opinion. There are better and worse ways to reach conclusions, and complex reasons for how we arrive at them. The “discussion” section of Wikipedia is a great place to begin to consider some of the processes involved.

  When he responded to my letter of concern, Middlebury’s president also noted that “the history department’s stance is not shared by all Middlebury faculty, and in fact last night we held an open forum on the topic, in which a junior faculty member in the history department and a junior faculty member in our program in film and media culture presented opposing views and invited questions and comments from a large and interested audience.” He added that “the continuing evolution of new ways of sharing ideas and information will require that the academy continue to evolve as well in its understanding of how these technologies fit into our conception of scholarly discourse. We are pleased that Middlebury can take part in this important debate.”

  The Middlebury debate, by the way, already has a place on Wikipedia. Maybe that’s the right place for high schools and colleges to begin as they hold their own forums on the learning opportunities of our moment, and the best ways to use new tools, critically, conscientiously, and creatively.

  < Christine Rosen >

  virtual friendship and the new narcissism

  Originally published in The New Atlantis (Summer 2007).

  CHRISTINE ROSEN is senior editor of The New Atlantis: A Journal of Technology & Society. She is the author of Preaching Eugenics: Religious Leaders and the American Eugenics Movement (2004) and My Fundamentalist Education: A Memoir of a Divine Girlhood (2005). Since 1999, Mrs. Rosen has been an adjunct scholar at the American Enterprise Institute for Public Policy Research, where she has written about women and the economy, feminism, and women’s studies. Her commentaries and essays have appeared in The New York Times Magazine, The Wall Street Journal, The New Republic, Washington Post, The Weekly Standard, Commentary, Wilson Quarterly, and Policy Review. She earned a Ph.D. in history from Emory University in 1999.

  FOR CENTURIES, the rich and the powerful documented their existence and their status through painted portraits. A marker of wealth and a bid for immortality, portraits offer intriguing hints about the daily life of their subjects—professions, ambitions, attitudes, and, most important, social standing. Such portraits, as German art historian Hans Belting has argued, can be understood as “painted anthropology,” with much to teach us, both intentionally and unintentionally, about the culture in which they were created.

  Self-portraits can be especially instructive. By showing the artist both as he sees his true self and as he wishes to be seen, self-portraits can at once expose and obscure, clarify and distort. They offer opportunities for both self-expression and self-seeking. They can display egotism and modesty, self-aggrandizement and self-mockery.

  Today, our self-portraits are democratic and digital; they are crafted from pixels rather than paints. On social networking websites like MySpace and Facebook, our modern self-portraits feature background music, carefully manipulated photographs, stream-of-consciousness musings, and lists of our hobbies and friends. They are interactive, inviting viewers not merely to look at, but also to respond to, the life portrayed online. We create them to find friendship, love, and that ambiguous modern thing called connection. Like painters constantly retouching their work, we alter, update, and tweak our online self-portraits; but as digital objects they are far more ephemeral than oil on canvas. Vital statistics, glimpses of bare flesh, lists of favorite bands and favorite poems all clamor for our attention—and it is the timeless human desire for attention that emerges as the dominant theme of these vast virtual galleries.

  Although social networking sites are in their infancy, we are seeing their impact culturally: in language (where “to friend” is now a verb), in politics (where it is de rigueur for presidential aspirants to catalogue their virtues on MySpace), and on college campuses (where not using Facebook can be a social handicap). But we are only beginning to come to grips with the consequences of our use of these sites: for friendship, and for our notions of privacy, authenticity, community, and identity. As with any new technological advance, we must consider what type of behavior online social networking encourages. Does this technology, with its constant demands to collect (friends and status) and perform (by marketing ourselves), in some ways undermine our ability to attain what it promises—a surer sense of who we are and where we belong? The Delphic oracle’s guidance was know thyself. Today, in the world of online social networks, the oracle’s advice might be show thyself.

  >>> making connections

  The earliest online social networks were arguably the Bulletin Board Systems of the 1980s that let users post public messages, send and receive private messages, play games, and exchange software. Some of those BBSs, like The WELL (Whole Earth ’Lectronic Link) that technologist Larry Brilliant and futurist Stewart Brand started in 1985, made the transition to the World Wide Web in the mid-1990s. (Now owned by Salon.com, The WELL boasts that it was “the primordial ooze where the online community movement was born.”) Other websites for community and connection emerged in the 1990s, including Classmates.com (1995), where users register by high school and year of graduation; Company of Friends, a business-oriented site founded in 1997; and Epinions, founded in 1999 to allow users to give their opinions about various consumer products.

  A new generation of social networking websites appeared in 2002 with the launch of Friendster, whose founder, Jonathan Abrams, admitted that his main motivation for creating the site was to meet attractive women. Unlike previous online communities, which brought together anonymous strangers with shared interests, Friendster uses a model of social networking known as the “Circle of Friends” (developed by British computer scientist Jonathan Bishop), in which users invite friends and acquaintances—that is, people they already know and like—to join their network.
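  The invitation-based structure described above can be made concrete with a small sketch. The following Python fragment is purely illustrative (it is not drawn from Friendster or any real site's code; the class and method names are invented for this example): a network that grows only when an existing member invites someone they already know, in contrast to open communities that anonymous strangers may join freely.

```python
# Illustrative sketch of an invitation-only "Circle of Friends" model.
# All names here (SocialNetwork, invite, circle_of_friends) are
# hypothetical, chosen for this example only.

class SocialNetwork:
    def __init__(self):
        self.members = set()
        self.friendships = set()  # unordered pairs, stored as frozensets

    def join(self, user):
        """Seed the network with a founding member."""
        self.members.add(user)

    def invite(self, inviter, invitee):
        """Only existing members may bring in people they already know."""
        if inviter not in self.members:
            raise ValueError("only members can send invitations")
        self.members.add(invitee)
        self.friendships.add(frozenset((inviter, invitee)))

    def circle_of_friends(self, user):
        """Everyone directly connected to this user."""
        return {other
                for pair in self.friendships if user in pair
                for other in pair if other != user}

net = SocialNetwork()
net.join("alice")
net.invite("alice", "bob")
net.invite("bob", "carol")
print(net.circle_of_friends("bob"))  # {'alice', 'carol'}
```

  The design point the sketch captures is that every edge in the graph is created by an invitation from someone already inside it, so each member's circle consists, at least at first, of people they knew offline.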

  Friendster was an immediate success, with millions of registered users by mid-2003. But technological glitches and poor management at the company allowed a new social networking site, MySpace, launched in 2003, quickly to surpass it. Originally started by musicians, MySpace has become a major venue for sharing music as well as videos and photos. It is now the behemoth of online social networking, with over 100 million registered users. Connection has become big business: In 2005, Rupert Murdoch’s News Corporation bought MySpace for $580 million.

  Besides MySpace and Friendster, the best-known social networking site is Facebook, launched in 2004. Originally restricted to college students, Facebook—which takes its name from the small photo albums that colleges once gave to incoming freshmen and faculty to help them cope with meeting so many new people—soon extended membership to high schoolers and is now open to anyone. Still, it is most popular among college students and recent college graduates, many of whom use the site as their primary method of communicating with one another. Millions of college students check their Facebook pages several times every day and spend hours sending and receiving messages, making appointments, getting updates on their friends’ activities, and learning about people they might recently have met or heard about.

 
