
The Digital Divide


by Mark Bauerlein


  Tim O’Reilly is more explicit about this commercial democracy, if not all that comprehensible. O’Reilly is the head of an Internet company called O’Reilly Media, and he is generally considered the originator of Web 2.0. To begin with, O’Reilly has a somewhat different view of the blogosphere from Rosen:

  The blogosphere is the equivalent of constant mental chatter in the forebrain, the voice we hear in all of our heads. It may not reflect the deep structure of the brain, which is often unconscious, but is instead the equivalent of conscious thought. And as a reflection of conscious thought and attention, the blogosphere has begun to have a powerful effect.

  “It may not reflect the deep structure of the brain, which is often unconscious, but is instead the equivalent of conscious thought.” If your toaster could write a sentence, it would write one just like that. O’Reilly goes on:

  First, because search engines use link structure to help predict useful pages, bloggers, as the most prolific and timely linkers, have a disproportionate role in shaping search engine results. Second, because the blogging community is so highly self-referential, bloggers paying attention to other bloggers magnifies their visibility and power . . . like Wikipedia, blogging harnesses collective intelligence as a kind of filter . . . much as PageRank produces better results than analysis of any individual document, the collective attention of the blogosphere selects for value.

  PageRank is Google’s algorithm—its mathematical formula—for ranking search results. This is another contribution, according to its touters, to access to information, and therefore yet another boon to “democracy.” PageRank keeps track of websites that are the most linked to—that are the most popular. It is, in fact, the gold standard of popularity in Web culture. What O’Reilly is saying, in plain English, is that the more people blog, and the more blogs link to each other, the more highly ranked the most popular blogs will be. When O’Reilly writes in his appliance-like manner that “the collective attention of the blogosphere selects for value,” he simply means that where the most bloggers go, people who are interested in general trends—businessmen and marketing experts, for instance—will follow. “Value” in O’Reilly’s sense is synonymous with popularity.
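  For readers curious what the algorithm O’Reilly invokes actually computes, here is a minimal sketch of the power-iteration idea behind PageRank. The toy link graph, the function name, the damping factor, and the iteration count are all illustrative assumptions of mine, not anything published by Google; the production system is vastly more elaborate. Even so, the sketch makes the point above concrete: score flows along links, so the most linked-to pages rank highest.

    # A minimal sketch of the power-iteration form of PageRank.
    # The toy graph and parameters below are illustrative assumptions,
    # not Google's production system.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}  # start everyone equal
        for _ in range(iterations):
            # Every page keeps a small baseline score...
            new_rank = {page: (1.0 - damping) / n for page in pages}
            for page, outgoing in links.items():
                if outgoing:
                    # ...and passes the rest, evenly, to the pages it links to.
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:
                        new_rank[target] += share
                else:
                    # A page with no outgoing links shares with every page.
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
            rank = new_rank
        return rank

    # Three hypothetical blogs: blog_a and blog_c both link to blog_b,
    # so blog_b ends up with the highest score.
    toy_web = {"blog_a": ["blog_b"], "blog_b": ["blog_c"], "blog_c": ["blog_b"]}
    print(pagerank(toy_web))

  Nothing in the loop asks whether a page is accurate or worthwhile; incoming links are all that count, which is exactly why “value” in this scheme collapses into popularity.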

  In this strange, new upside-down world, words like “democracy” and “freedom” have lost their meaning. They serve only to repel criticism of what they have come to mean, even when that criticism is made in the name of democracy and freedom.

  >>> through the looking glass

  What would you have said if I had told you, ten years ago, that there would soon come a time when anyone with something to say, no matter how vulgar, abusive, or even slanderous, would be able to transmit it in print to millions of people? Anonymously. And with impunity.

  How would you have reacted if I had said that more drastic social and cultural changes were afoot? To wit: Powerful and seasoned newspaper editors cowering at the feet of two obscure and unaccomplished twentysomethings, terrified that this unassuming pair might call them “douchebags” in a new gossip sheet called Gawker. An obscure paralegal in Sacramento, California, who often makes glaring grammatical mistakes on his blog, becoming one of the most feared people in American literary life, on account of his ability to deride and insult literary figures. High school kids called “administrators” editing entries in a public encyclopedia, entries that anyone, using an alias, could change to read in any way he or she wanted. Writers distributing their thoughts to great numbers of people without bothering to care about the truth or accuracy of what they were writing; writers who could go back and change what they wrote if they were challenged—or even delete it, so that no record of their having written it would exist.

  You would have laughed at me, I’m sure. Maybe you would have thought that I was purposefully and ludicrously evoking Stalin, who rewrote history, made anonymous accusations, hired and elevated hacks and phonies, ruined reputations at will, and airbrushed suddenly unwanted associates out of documents and photographs. You might have said, What point are you trying to make by saying that our American democracy is moving toward a type of Stalinism? How trite, to compare American democracy to its longtime nemesis using crude inversions. Are you some sort of throwback to the anti-American New Left?

  And what if I had, to your great irritation, persisted and told you that anyone who tried to criticize one or another aspect of this situation would immediately be accused of being antidemocratic, elitist, threatened by change, and pathetically behind the times? If I had told you that in fact, because of these risks, few people ever did offer any criticism? The gospel of popularity had reached such an extent in this upside-down world that everyone, even powerful, distinguished people, cringed at the prospect of being publicly disliked.

  What I’ve been describing is the surreal world of Web 2.0, where the rhetoric of democracy, freedom, and access is often a fig leaf for antidemocratic and coercive rhetoric; where commercial ambitions dress up in the sheep’s clothing of humanistic values; and where, ironically, technology has turned back the clock from disinterested enjoyment of high and popular art to a primitive culture of crude, grasping self-interest. And yet these drastic transformations are difficult to perceive and make sense of. The Internet is a parallel universe that rarely intersects with other spheres of life outside its defensive parameters.

  Here is John Battelle, a co-founder of Wired magazine, in his book, The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture. Like Toffler and Gladwell, Battelle is all for bringing leisure time into the marketplace:

  On the Internet, it can be argued, all intent is commercial in one way or another, for your very attention is valuable to someone, even if you’re simply researching your grandmother’s genealogy, or reading up on a rare species of dolphin. Chances are you’ll see plenty of advertisements along the way, and those links are the gold from which search companies spin their fabled profits.

  Battelle wants to press home the importance of multiple searches to advertisers. He uses the following quotation to make his point:

  Thorstein Veblen, the early-twentieth-century thinker who coined the term “conspicuous consumption,” once quipped, “The outcome of any serious research can only be to make two questions grow where only one grew before” . . . In fact, Pew research shows that the average number of searches per visit to an engine [that is, a search engine, like Google] is nearly five . . . This copious diversity drives not only the complexity of the search itself, but also the robustness of the advertising model that supports it.

  But Veblen was talking about the humanistic value of research, not the commercial value of a “search”! He was saying that the world was ultimately mysterious and unfathomable, and that therefore the quest for knowledge had no terminus—that the disinterested, endless quest for knowledge was an end in itself. Battelle can only understand Veblen in the context of commerce and the Web.

  Which context is often so unreal, yet so confident in its unreality, that it has the very real effect of making any criticism of it seem absurd.

  That’s what Alice Mathias, a senior at Dartmouth College, discovered. On a blog in the New York Times called “The Graduates: Eight College Seniors Face the Future,” Mathias contributed a dry, witty, yet openhearted column titled “Love in the Digital Age.” She concluded it like this:

  For example, Dartmouth students have recently had to deal with the construction of the website boredatbaker.com (which has cousins at the other Ivies, the Massachusetts Institute of Technology, New York University and Stanford). Intended as a community tool, this website has mutated into a forum for the anonymous publication of very personal attacks on students who must try their best not to be emotionally affected when people publicly question their sexuality, comment on their appearance and speculate about their value as humans.

  In anonymous Internet attacks, people can say things they would never mention aloud while looking their target in the eye. No one need take any personal responsibility. The victims of these unfortunate manifestations of free speech must suspend their emotions and try to trust that people around them (including love interests) aren’t the ones who are writing or consuming this stuff. The safest thing to do in our boredatbaker-shadowed community is to be emotionally isolated from everyone until graduation brings escape.

  College students used to be the active arm of society’s conscience. The ones, like Mathias, with the most sensitive consciences often protested war, racial bias, inequitable social policies. If an instance of corruption or injustice occurred in the town or city where they went to school, they often took to the streets to demonstrate or to march with the local townspeople or to stand on a picket line. Or maybe they just edited a mordantly honest literary magazine. Now they tremble helplessly before the Internet’s Alice-in-Wonderland, truth-eliding, boundary-busting juggernaut.

  What can they do? The language of protest college students once used—democracy, freedom, power to the people, revolution—has been taken over by the very forces that are invading and bruising their inner lives. The people who run boredatbaker.com would no doubt respond to criticism of their anonymous character assassinations by echoing Lawrence Lessig, Jay Rosen, and others and crying “free speech” and “democracy” and “don’t fight the future.” Graduation probably won’t bring escape, either. At Gawker.com, a Manhattan-based website that makes random attacks on media figures, a site run by people you’ve never heard of—who might just as well be anonymous—you even have the opportunity to buy the official Gawker T-shirt, which has the word “Douché,” referring to a favorite Gawker insult, printed on the front. Incredibly, the high school stigma of unpopularity has become so great that the accomplished adults of the New York media world live in fear of this adolescent silliness.

  In this upside-down new world, student rebellion would have the appearance of reactionary resentment. But then, in this new upside-down world, politically active students appear in long threads on political blogs as “hits” rather than as real bodies protesting in the streets.

  < William Deresiewicz >

  the end of solitude

  Originally published in The Chronicle of Higher Education (January 30, 2009).

  WILLIAM DERESIEWICZ is an author, essayist, and book critic. Deresiewicz is a contributing writer for The Nation and a contributing editor for The New Republic. His work has also appeared in The New York Times Book Review, Bookforum, and elsewhere. A Jane Austen Education: How Six Novels Taught Me About Love, Friendship, and the Things That Really Matter was published in April 2011. More information at www.billderesiewicz.com.

  WHAT DOES THE contemporary self want? The camera has created a culture of celebrity; the computer is creating a culture of connectivity. As the two technologies converge—broadband tipping the Web from text to image, social-networking sites spreading the mesh of interconnection ever wider—the two cultures betray a common impulse. Celebrity and connectivity are both ways of becoming known. This is what the contemporary self wants. It wants to be recognized, wants to be connected. It wants to be visible. If not to the millions, on Survivor or Oprah, then to the hundreds, on Twitter or Facebook. This is the quality that validates us; this is how we become real to ourselves—by being seen by others. The great contemporary terror is anonymity. If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.

  So we live exclusively in relation to others, and what disappears from our lives is solitude. Technology is taking away our privacy and our concentration, but it is also taking away our ability to be alone. Though I shouldn’t say taking away. We are doing this to ourselves; we are discarding these riches as fast as we can. A teenager I know, I was told by one of her older relatives, had sent three thousand text messages in one recent month. That’s one hundred a day, or about one every ten waking minutes, morning, noon, and night, weekdays and weekends, class time, lunch time, homework time, and toothbrushing time. So on average, she’s never alone for more than ten minutes at once. Which means, she’s never alone.
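  A quick check of that arithmetic, taking a thirty-day month and assuming roughly seventeen waking hours a day (the waking-hours figure is my assumption, not the essay’s):

  \[
  \frac{3000 \text{ texts}}{30 \text{ days}} = 100 \text{ texts per day},
  \qquad
  \frac{17 \text{ hours} \times 60 \text{ min/hour}}{100 \text{ texts}} \approx 10 \text{ minutes per text}.
  \]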

  I once asked my students about the place that solitude has in their lives. One of them admitted that she finds the prospect of being alone so unsettling that she’ll sit with a friend even when she has a paper to write. Another said, why would anyone want to be alone?

  To that remarkable question, history offers a number of answers. Man may be a social animal, but solitude has traditionally been a societal value. In particular, the act of being alone has been understood as an essential dimension of religious experience, albeit one restricted to a self-selected few. Through the solitude of rare spirits, the collective renews its relationship with divinity. The prophet and the hermit, the sadhu and the yogi, pursue their vision quests, invite their trances, in desert or forest or cave. For the still, small voice speaks only in silence. Social life is a bustle of petty concerns, a jostle of quotidian interests, and religious institutions are no exception. You cannot hear God when people are chattering at you, and the divine word, their pretensions notwithstanding, demurs at descending on the monarch and the priest. Communal experience is the human norm, but the solitary encounter with God is the egregious act that refreshes that norm. (Egregious, for no man is a prophet in his own land. Tiresias was reviled before he was vindicated, Teresa interrogated before she was canonized.) Religious solitude is a kind of self-correcting social mechanism, a way of burning out the underbrush of moral habit and spiritual custom. The seer returns with new tablets or new dances, his face bright with the old truth.

  Like other religious values, solitude was democratized by the Reformation and secularized by Romanticism. In Marilynne Robinson’s interpretation, Calvinism created the modern self by focusing the soul inward, leaving it to encounter God, like a prophet of old, in “profound isolation.” To her enumeration of Calvin, Marguerite de Navarre, and Milton as pioneering early-modern selves we can add Montaigne, Hamlet, and even Don Quixote. The last figure alerts us to reading’s essential role in this transformation, the printing press serving an analogous function in the sixteenth and subsequent centuries to that of television and the Internet in our own. Reading, as Robinson puts it, “is an act of great inwardness and subjectivity.” “The soul encountered itself in response to a text, first Genesis or Matthew and then Paradise Lost or Leaves of Grass.” With Protestantism and printing, the quest for the divine voice became available to, even incumbent upon, everyone.

  But it is with Romanticism that solitude achieved its greatest cultural salience, becoming both literal and literary. Protestant solitude is still only figurative. Rousseau and Wordsworth made it physical. The self was now encountered not in God but in Nature, and to encounter Nature one had to go to it. And go to it with a special sensibility: The poet displaced the saint as social seer and cultural model. But because Romanticism also inherited the eighteenth-century idea of social sympathy, Romantic solitude existed in a dialectical relationship with sociability—if less for Rousseau and still less for Thoreau, the most famous solitary of all, then certainly for Wordsworth, Melville, Whitman, and many others. For Emerson, “the soul environs itself with friends, that it may enter into a grander self-acquaintance or solitude; and it goes alone, for a season, that it may exalt its conversation or society.” The Romantic practice of solitude is neatly captured by Trilling’s “sincerity”: the belief that the self is validated by a congruity of public appearance and private essence, one that stabilizes its relationship with both itself and others. Especially, as Emerson suggests, one beloved other. Hence the famous Romantic friendship pairs: Goethe and Schiller, Wordsworth and Coleridge, Hawthorne and Melville.

  Modernism decoupled this dialectic. Its notion of solitude was harsher, more adversarial, more isolating. As a model of the self and its interactions, Hume’s social sympathy gave way to Pater’s thick wall of personality and Freud’s narcissism—the sense that the soul, self-enclosed and inaccessible to others, can’t choose but to be alone. With exceptions, like Woolf, the modernists fought shy of friendship. Joyce and Proust disparaged it; D. H. Lawrence was wary of it; the modernist friendship pairs—Conrad and Ford, Eliot and Pound, Hemingway and Fitzgerald—were altogether cooler than their Romantic counterparts. The world was now understood as an assault on the self, and with good reason.

  The Romantic ideal of solitude developed in part as a reaction to the emergence of the modern city. In modernism, the city is not only more menacing than ever; it has become inescapable, a labyrinth: Eliot’s London, Joyce’s Dublin. The mob, the human mass, presses in. Hell is other people. The soul is forced back into itself—hence the development of a more austere, more embattled form of self-validation, Trilling’s “authenticity,” where the essential relationship is only with oneself. (Just as there are few good friendships in modernism, so are there few good marriages.) Solitude becomes, more than ever, the arena of heroic self-discovery, a voyage through interior realms made vast and terrifying by Nietzschean and Freudian insights. To achieve authenticity is to look upon these visions without flinching; Trilling’s exemplar here is Kurtz. Protestant self-examination becomes Freudian analysis, and the culture hero, once a prophet of God and then a poet of Nature, is now a novelist of self—a Dostoyevsky, a Joyce, a Proust.

  But we no longer live in the modernist city, and our great fear is not submersion by the mass but isolation from the herd. Urbanization gave way to suburbanization, and with it the universal threat of loneliness. What technologies of transportation exacerbated—we could live farther and farther apart—technologies of communication redressed—we could bring ourselves closer and closer together. Or at least, so we have imagined. The first of these technologies, the first simulacrum of proximity, was the telephone. “Reach out and touch someone.” But through the ’70s and ’80s, our isolation grew. Suburbs, sprawling ever farther, became exurbs. Families grew smaller or splintered apart, mothers left the home to work. The electronic hearth became the television in every room. Even in childhood, certainly in adolescence, we were each trapped inside our own cocoon. Soaring crime rates, and even more sharply escalating rates of moral panic, pulled children off the streets. The idea that you could go outside and run around the neighborhood with your friends, once unquestionable, has now become unthinkable. The child who grew up between the world wars as part of an extended family within a tight-knit urban community became the grandparent of a kid who sat alone in front of a big television, in a big house, on a big lot. We were lost in space.

 
