I suppose the albums themselves tell the tale. As most teenagers do, I tired of posing for family photos. At fourteen I identified the full complement of frauds I was participating in and began a process of elimination. Smiling for my mother was either first or second to monthly confession. But my disenchantment with being photographed persisted beyond the awkward years. It seemed the more images I consumed in a given day, the less readily I would sit for a picture. This was all easier to manage in the days when everyone leaned in over a festive groaning board a few times and called it a year.
The upshot is that every photograph taken of me as an adult that wasn’t an outright theft has involved capitulation. After a few such surrenders in my twenties—including a heroic submission to a professional sitting—I felt sufficient images were available to prove my existence and satisfy everyone seeking its reminder. Having fulfilled a reasonable quota, I effectively retired. Timing-wise, it was like Churchill hanging up his jowls while Poland fell. Just as I issued a blanket “I prefer not to,” the digital-camera bomb went off and the whole world mobilized for deployment.
By 2011, social-media sites were absorbing hundreds of millions of personal photographs each day. That these images are conceived and then live as a combination of light and code makes their numbers particularly unintelligible; it’s like counting cloud particles. In the new social reality, to refuse to be photographed is not only to be antisocial but in some sense to negate one’s own existence. The potential for every human being on earth to confront the existence of every other has arrived—George Eliot’s global moral economy realized as a data swap.
But there too Eliot’s remarks on the limits of human connection seem prescient. Eliot makes us aware, for instance, that when the artist Naumann is first captivated by Dorothea in the Vatican museum, she has just realized the grim reality of her marriage to Casaubon while lingering beside a Hellenistic sculpture of the ravishing Ariadne. Naumann gasses on to Ladislaw about capturing the “antique beauty” and “sensuous perfection” of a woman we know to be in the depth of a private misery.
But then a lot of people see a lot of things in Dorothea—Eliot herself compares her to several of God’s favorite saints—each impression a riff on the feminine ideal. Naumann may have rotten timing, Eliot suggests, but he’s not wrong. He’s just limited by human subjectivity, our gift and our curse; he sees what he sees. Having moved her heroine to tears, Eliot pauses to reflect on the weight of a few drops in a larger balance:
Some discouragement, some faintness of heart at the new real future which replaces the imaginary, is not unusual, and we do not expect people to be deeply moved by what is not unusual. That element of tragedy which lies in the very fact of frequency, has not yet wrought itself into the coarse emotion of mankind; and perhaps our frames could hardly bear much of it. If we had a keen vision and feeling of all ordinary human life, it would be like hearing the grass grow and the squirrel’s heart beat, and we should die of that roar which lies on the other side of silence. As it is, the quickest of us walk about well wadded with stupidity.
Often, scything through the streets of New York, the physical frustration of negotiating the city’s endless stream of bodies forms a kind of psychic buffer. It’s only on those rare occasions when you’re penetrated, in a glimpse, by the entirety—the person-ness—of each of those bodies, that the meaning of overwhelmed appears, at full gallop, on the horizon. “Perfect” perception would be the end of us. The handful of people on the planet with flawless, unexpurgated memories—who can tell you which episode of St. Elsewhere was airing on November 14, 1985, or the exact shitty thing that their sister said between the second and third course thirty-seven Easter dinners ago—strike me as tragic figures of particular modernity. Gathered for a recent television interview, a number of these individuals commiserated about the impossibility of maintaining close personal relationships, and the way that access to shallow detail across a deep stretch of time had cluttered their minds to the point of dysfunction. Perfect recall is the enemy of memory, which relies for its particular textures on the art—to say nothing of the mercy—of forgetting.
The brains of those prodigal rememberers have been likened to computers. But all technology aspires to human ideals or ideologies—so that to have the memory of a machine is to be superhuman, and to appear photoshopped is to achieve perfection. And the Internet, in its infinitude, only fulfills the modern desire for mastery over time and space. It was built in our image, in other words, a reality in which the operational imperative of “saving time” forms a paradox: driven on the one hand by the wish to make minor and major interactions, tasks, and consumptions happen as quickly as they possibly can, the Internet is also designed to literally save each second as it passes, preserving every hour in coded amber with a diligence that might seem sentimental were it not so straight-up fascist.
* * *
If anything could, shouldn’t images rescue us from such a fate? Is there not still an essential purity to what they show us about the world? In the United States, the first version of what became a clichéd allusion to the photographic image’s enviable clarity appeared in a 1914 New York Times ad for real estate: “A look is worth a thousand words.” In 1921, Printers’ Ink writer Frederick Barnard repeated the claim as an advertising tip, then used it again in the same magazine six years later, replacing look with picture, thus shifting the burden of power from the observer to the thing being observed.
A flooded market has warped and depreciated that power. To make images of food and faces and bikini-clad flanks stand out, advertisers digitally whittle and polish to the point that presenting them as photographs—that is to say, as a reflection of reality—constitutes a kind of fraud. In the women’s beauty and lifestyle industries especially, companies like H&M are forgoing the imperfect, inconsistent human form entirely, generating their models from pixelated scratch. Photographic scrutiny has become too intense for even the most beautiful bodies to bear, and a totalizing sea of images has produced a standard so strict it verges on uncanny. Representing this hyperreality, paradoxically, requires a digital paintbrush and a neo-mannerist take on human proportion, not an expensive lens and good light.
Civilian image-makers, having also felt the burn of ubiquity, seek new ways to set their photographic lives apart from the amalgamating effects of social media. The first and so far most ingenious attempt to capitalize on this anxiety was the 2009 debut of a camera application called Hipstamatic (slogan: “Digital photography never looked so analog”). The equivalent of an aggressive lens filter, Hipstamatic is designed to imbue the disposable digital image with the qualities of time and memory we now associate with earlier photographic eras. Hipstamatic spokesman Mario Estrada has admitted that technically the app makes crummy camera-phone images look even worse. But it’s a gorgeous corrosion, Estrada claims, and the images are now crummy “in the most beautiful way.”
Different settings yield different patinas and color schemes, all meant to mimic both the limits of early mass-market cameras (the designers claim to have nicked the name Hipstamatic from a disposable camera manufactured in the early 1980s) and the numinous effects of time on print photographs. A century or so into photographic history, those effects were already apparent. Black-and-white photos from my father’s childhood seemed romantically careworn to me as a kid—the paper albums, the adhesive corners that protect the photo and hold it in place, the white-picket frame built into every image. Because my grandfather was gadget-prone, all of my father’s early home movies are also shot in crisp, color-saturated 16 mm. Scenes from my own childhood were recorded on fuzzy Super 8, then transferred to VHS in the mid-1990s. By the mid-2000s, that transfer had been transferred to DVD, so that now the white-lettered words PLAY and PAUSE occasionally appear in the upper-right corner—ghost traces, along with a damnably lachrymose pan-flute sound track, of the VHS layover.
While chaptering through the DVD with my cousin and her twentysomething boyfriend recently, searching for rare footage of my cousin’s long-deceased mother, I was dispirited by the third-generation hemorrhaging of the images I remember chiefly as scenes I’ve watched before. But as my cousin and I squinted for signs of ourselves and our loved ones, her boyfriend fell into an aesthetic swoon. For him the failing images were incredible—realer than real, more authentic for their desiccated veils. He’s a Hipstamatic fan, naturally, and therefore a connoisseur of the distressed look that, until our home movies, he knew chiefly by facsimile. I found myself more moved by his passion for the texture of the images than the grainy, pan-fluty hologram of my own first, thundering footsteps across the old living-room floor.
Camera apps became the center of a mild controversy in early 2011, when a photo taken with an iPhone using the Hipstamatic app won third prize in an international competition. The photo, taken by the New York Times photographer Damon Winter, depicts two American soldiers on an Afghan patrol: the helmet of the closer one looms in the center of the frame; the farther soldier is poised to return fire and appears above the nearer one’s right shoulder, pointing his M16 into a vale of trees. The composition is striking, as is the color scheme, which blends the soldiers’ camouflage and the surrounding flora into a corona of yellow and green. As much as the basics, however, the image’s success relies on its aura: the vignetting effect responsible for the color wash is designed to give the image an antique-y look—a look, specifically, now associated with the war photography that came out of Vietnam.
The debate over Winter’s award was scattered. Some picketed the fading line that distinguishes “professional” photographers from everyday snappers. Others discussed the ethics of using an aesthetic that references a past aesthetic to capture the “reality” of a given situation—in this case an American occupation. Clicking through Winter’s Hipstamatic portfolio of his time in Afghanistan, I remembered my father’s story about having to stop taking meals, in the late sixties, with the television news on. This was when my parents lived in New York, in the time before the draft notice arrived, which is to say before they hightailed it back to Canada. They’d never seen such carnage, real-life carnage, on any screen. Images like that have been withheld this time around, adding a darker sheen of irony to any appropriation, in the coverage of this war, of Vietnam’s brutal confrontations. The ambivalence surrounding these particular conflicts seems to have driven an aesthetic that is both ultramodern and explicitly aligned with established war imagery. Did Winter’s antiqued images seem more authentic because the Vietnam War somehow feels more real?
In Winter’s defense of his Hipstamatic photos, published by the Times in the wake of the controversy over his win, he notes the public’s naïveté about how images are made. For him the iPhone is just another tool in the photographer’s ongoing excavation of the world—in this case useful precisely because it didn’t spook the soldiers, who all carry camera phones themselves. Winter hoped representing American soldiers with the recognizably casual intimacy of a lazy-Sunday iPhone shoot with one’s cat might have a de-anonymizing effect, rescuing the subjects from generic GI constraints even as it surrounds them with the familiar tint of the past. By bringing war into the aesthetic world we live in, the medium itself could help make us care. And, anyway, the iPhone seemed well suited to an atmosphere the photographer described elsewhere as “more like summer camp with guns” than a military operation.
Winter’s images of soldiers horse-playing and sleeping in a pile do have a dreamy, summer-campy quality. Looking at them, arguments about declining standards and digital parameters feel beside the point: if there’s peril in app-driven war photography, it involves the gentle death grip of nostalgia. If the compulsion to mediate our lives suggests a pathological remove from the present, Hipstamatic’s insta-pastiche beggars Roland Barthes’s belief that photography is important foremost because it helps us believe in the past. The present moment, ostensibly frozen by a camera, can now be distended to suggest any number of past realities. That distension is actually built into the thing taking the pictures, further layering the world with synthetic meaning in real time and fully, finally confusing the quality of authenticity that has obsessed our relationship to the image from the start.
The Bush White House claimed censoring images of military injuries and casualties was a matter of respect. It’s hard not to wonder, ten years later, what difference it might have made. It’s harder to grasp how fully the video games, war movies, torture-driven horror films, and Internet snuff buffets have informed and maybe even sated our curiosity about the realities of combat. Those doing the fighting deal with the far side of that influence. Again and again, in accounts of the Iraq and the Afghanistan wars, the disillusionment of young military subjects is defined by their media-fostered expectations.
The success of first-person shooter video games like Call of Duty inspired the military to step up their use of virtual-reality games in preparing soldiers for the experience of war. A 2008 MIT study concluded that America’s Army, a Call of Duty knockoff developed by the military, is their most successful recruiting tool yet. Some say this kind of training is more about desensitizing soldiers to death and violence—quite a thought when you consider the preponderantly nonmilitary domain of video gaming. Older brass have noted that while the young, video-game-weaned recruits have startling console dexterity and hand-eye reflexes, they are less able—even unable—to distinguish between what’s real and what’s not. If there’s any way to explain the glazed, incongruous glee of the soldiers in the Abu Ghraib images, it may have to do with this sense of irreality, of being there but not there—a feeling whose natural accelerant and antidote is the camera.
The initial response to the Abu Ghraib images was pretty universal: revulsion. Revulsion derived in part from the fact that these were not secondhand stories leaked by a mole or an intrepid reporter, but crimes that American soldiers documented themselves, as they happened, like for fun. As Susan Sontag noted at the time, not even the Nazis—obsessive archivists of their own atrocities—were known to cram a thumbs-up into the frame.
Sontag also used the Abu Ghraib scandal to point out that the purpose of every digital image is tied to its own dissemination, something reconfirmed by every new viral cell-phone video of a dictator’s grisly lynching, or violation of an anonymous young girl. My mind goes blank when I hear stories about kids who grew up in a digital camera culture documenting their felonies and uploading them to Facebook. The horror is demagnetizing. The only sense to be made is surely tied to our desperation for the crisp sting of reality in an increasingly padded, prismatic world. If we’ve reached the point where what is not photographed does not count as “real,” then in some situations the paradox may be that the camera is introduced to somehow complete or verify a moment that felt too surreal, as though it weren’t really happening. And yet the most pervasive reality to emerge from camera culture meets only the most basic—which is to say legal—parameters. Pics, as they say, or it didn’t happen.
* * *
In their respective 1970s meditations on photography, Roland Barthes and Susan Sontag ground certain of their arguments in the physical nature of the photograph. Barthes seemed to revel more in what photography—“a carnal medium, a skin I share with anyone who has been photographed”—could do than what it might; Sontag contrasted photography favorably with the chaos of television, “a stream of underselected images, each of which cancels its predecessor.” Barthes preferred the still to the moving image because it asked and allowed for more of us—only subjectivity can develop, create, complete, the well-selected image. Sontag felt the physical fact of photographs was the source of both their power and their manageability, making them more given to a governing “ecology.”
Although that hope seems far off indeed thirty years and untold trillions of images later, a basic question behind it persists: Is there anything that should not be photographed?
Celebrity photographer Eve Arnold has described the photographer’s ecological responsibility as a matter of “gatekeeping.” The death of a famous subject—and the subsequent surge to their living image—tests the photographer. Arnold calls Bert Stern’s incessant republication of the photographs he took of Marilyn Monroe (whom she also photographed), known as the Last Sitting, a betrayal, especially given evidence of the actress’s attempt to destroy a whack of them. But there is no privacy for the dead, and perhaps only the simulacrum of it for the living.
“As everyone knows who has ever heard a piece of gossip,” wrote Janet Malcolm in “The Silent Woman,” her 1993 New Yorker serial concerning the embattled literary estate of Sylvia Plath, the role of Plath’s husband, Ted Hughes, in shaping her legacy, and the controversy surrounding the lengthening queue of Plath biographies, “we do not ‘own’ the facts of our lives at all.
This ownership passes out of our hands at birth, at the moment we are first observed. The organs of publicity that have proliferated in our time are only an extension and a magnification of society’s fundamental and incorrigible nosiness. Our business is everybody’s business, should anybody wish to make it so. The concept of privacy is a sort of screen to hide the fact that almost none is possible in a social universe. In any struggle between the public’s inviolable right to be diverted and an individual’s wish to be left alone, the public almost always prevails. After we are dead, the pretense that we may somehow be protected against the world’s careless malice is abandoned. The branch of the law that putatively protects our good name against libel and slander withdraws from us indifferently.
In this light, Anthony Summers’s publication of a photo of Monroe’s corpse in his 1996 biography appears inevitable; the public prevailed. It was, Arnold felt, “the ultimate in horror, to me, of what can happen to a picture.”
Today, it would seem, everything should be photographed, and everything that is photographed should be seen. It is a matter of maintaining our new social ecology; to resist is futile, as is the expectation of “privacy” as we have conceived of it since royal copulation as a spectator sport fell out of favor. What is it, we ask of the mother who complains about her child’s photo being posted on another parent’s social-media page, that you have to hide? What exactly are you worried about? Anyone who has badgered a stranger or even a good friend to stop posting his picture on the Internet has discovered the paradox of this new world of individuals living in a country of images: mutually assured solipsism means nothing is sacred. You’d think it would have created a kingdom of libertarians, stone-walled mini-fiefdoms as far as the eye can see. Instead we have chosen to believe that a self-interested society can run on the pretense of sharing and make a manicured production of living open and expansively represented lives online.