by Dale Peck
Despite these obstacles, I managed to discuss the murders with a few men who, let’s say in deference to Graham and Det. Chief Superintendent John, were at greater risk than “the general public” of becoming the killer’s next victims. I spoke to these men not as a writer but as someone (notwithstanding the callowness and bluster of his journal) trying to balance a desire to get laid with the possibility that the man who came up to me in the Backstreet or the Block with a loutish smile on his face and a pair of handcuffs dangling from his belt might be wondering what I looked like, not with a gag in my mouth, but my own severed genitals—which, not surprisingly, is what most of the people I spoke to were also thinking. In the course of these slightly surreal nights (during one of which I met the man I ended up dating for the next three years) I was told all sorts of things, none of which I wrote down and most of which I forgot almost immediately. But the one thing I do remember someone telling me, a man named Andy whom I met at the Block, was that he didn’t pay much heed to what the tabloids printed about this serial killer because there had been another serial killer loose in London for twelve years, and they didn’t have anything sensible to write about that one either. It didn’t really matter to the straight press how we died, Andy said. It just mattered that we were dying. 
This was not, I think, too harsh a judgment—not for 1993, when queers were still grappling with “the murderous representations of homosexuals unleashed and ‘legitimized’ by AIDS,” as Leo Bersani had put it back in 1987, and possibly not even for 2013, when, as I write, the New York City Police Department is investigating the January 2 murder of David Rangel, the January 28 murder of Charles Romo, and the February 9 murder of Joseph Benzinger, all three of whom appear to have been killed by men they picked up for sex, not to mention the May 17 murder of Mark Carson by Elliot Morales, who, according to witnesses, shouted antigay slurs at Carson on a West Village street before shooting him point-blank in the face, or the ever-increasing cache of YouTube videos of the assaults and murders of gay men in Russia and Nigeria. And if you think it is too harsh, well, fuck you too.
10
“The imagination,” James Baldwin wrote in The Evidence of Things Not Seen, his take on Wayne Williams and the Atlanta child murders of 1979–1981, “is poorly equipped to accommodate an action in which one, instinctively, recognizes the orgasmic release of self-hatred.” Poorly equipped maybe, but fascinated, as evidenced by the ever-proliferating biographies and biopics about serial killers—the crime-scene pictures, the reenactments, the interviews from prison, the tearful, highly compensated testimonials from survivors and relatives and sometimes the killers themselves—not to mention the literary musings of a Dennis Cooper or James Ellroy or Bret Easton Ellis, which, though every bit as visceral as their pulp counterparts, at least dispense with any show of insight and portray their protagonists as manifestations of a violence whose cause is at once external and internal, cultural as well as psychological, and, ultimately, as bland as it is erotic. Which is to say: at some point during the 1970s or 1980s serial murder became a spectator sport, and gay serial killers, with their clueless wives and teenage accomplices, their necrophilia and cannibalism, their clown paintings and torture chambers, their inchoate teenage victims running through impoverished urban neighborhoods “buck naked … beaten up … very bruised,” only to be returned to their murderer by a policeman who couldn’t “do anything about somebody’s sexual preferences in life,” became America’s favorite gladiators. For homophobes, gay serial murder was a perfect ouroboros: men like Colin Ireland and Jeffrey Dahmer and the as-yet unidentified Richard Rogers confirmed their worst suspicions about faggots while simultaneously reassuring them that homosexuality was a self-annihilating phenomenon.
And at the same time that gay serial killers were becoming fixtures on televisions and magazine covers and book jackets, PWAs were also becoming increasingly visible in the American media—on talk shows and in movies and a spate of books so numerous that they merited their own category at the Lambda Literary Awards. But despite the obvious sympathy with which most of these narratives were pitched, I viewed them with the same suspicion I viewed media coverage of gay serial killers—as, if not manifestations of a desire to see dead queers, then aesthetic accomplices of same. This was true whether the subject of the story was homosexual or heterosexual. In the former case, AIDS functioned as a reminder that gay men were being punished for the crime of their sexual activities, as when, in Philadelphia, Tom Hanks’s character confesses that he’d had sex in a porn theater, not a thousand times, not a hundred times, not a dozen times, but “once” (so innocent is Hanks’s Andrew Beckett that the only thing he says to his partner as he enters a private booth—besides his name, which is pretty much the last thing someone says at that point in the game—is “Now what do we do?”). If, on the other hand, the subject was heterosexual, it served to incite the “general public” ’s hatred of gay men, without whose promiscuity the epidemic would not exist and, more to the point, wouldn’t threaten innocents: faithful wives, hemophiliac children, the unsuspecting Samaritans of the health care profession. As Alison Gertz, who believed she contracted HIV from a “bisexual” man, told readers of the New York Times, “I’m heterosexual, and it only took one time for me.” I don’t mean to suggest that Gertz or the makers of Philadelphia bore any expressed or even unconscious animosity toward gay men, only that the context in which their stories were presented couldn’t help but reinforce the prejudices many people had about the things gay men did, and the consequences of those actions.
Because AIDS, as Simon Watney had told us in Policing Desire, in addition to being a medical catastrophe, was “a crisis of representation”:
From very early on in the history of the epidemic, Aids has been mobilised to a prior agenda of issues concerning the kind of society we wish to inhabit. These include most of the shibboleths of contemporary “familial” politics, including anti-abortion and anti-gay positions. It is therefore impossible to isolate the representation of Aids, or campaigns on behalf of people with Aids, from this contingent set of values and debates.
Leo Bersani, in “Is the Rectum a Grave?” pushed Watney’s point further: “The persecuting of children of heterosexuals with AIDS (or who have tested positive for HIV) is particularly striking in view of the popular description of such people as ‘innocent victims.’ It is as if gay men’s ‘guilt’ were the real agent of infection.” The children Bersani referred to in this instance were Ricky, Robert, and Randy Ray, three HIV-positive brothers who lived in Arcadia, Florida, and who had been kicked out of elementary school because it was feared they might infect fellow students or teachers. On August 5, 1987, a federal court ordered the boys’ school to reinstate them, but, far from normalizing the Rays’ life, the judgment unleashed a torrent of death threats against the family, culminating in the August 28, 1987 torching of their home, at which point the Rays fled to Sarasota. “If the good citizens of Arcadia, Florida, could chase from their midst an average, law-abiding family,” Bersani wrote, “it is, I would suggest, because in looking at three hemophiliac children they may have seen—that is, unconsciously represented—the infinitely more seductive and intolerable image of a grown man, legs high in the air, unable to refuse the suicidal ecstasy of being a woman.”
Larry Kramer had been more blunt on the subject in “1,112 and Counting”: “All gays are blamed for John Gacy, the North American Man/Boy Love Association, and AIDS.” Maybe this sublimated rage faded over time or maybe we just stopped talking about it, but by the mid-nineties it had been supplanted or supplemented by the entropic psychological effect of what had by then become the standard formula with which people with AIDS were depicted in books and movies and on the news—the “normal” life, the “nagging” cough, the “shocking” diagnosis, the “inexorable” decline—a biographical shorthand as anodyne and irrelevant as Thomas Mulcahy’s flowerbeds and Anthony Marrero’s Phillies tryout and Michael Sakara’s rendition of “I’ll Be Seeing You.” Just enough information was given to arouse the audience’s sympathy, not in an individual but in a human outline who, but for a few details, could have been “you,” could have been “me.” The only real variable was whether the subject died or lived at story’s end, although in either case the experience was, to reclaim a word from Camille Paglia (assuming anyone remembers Camille Paglia), chthonic, and cathartic as well. And as Bertolt Brecht pointed out nearly a century ago (does anyone remember Bertolt Brecht?), the effect of catharsis—indeed, its very goal, as reflected in its original Greek meaning of purgation or purification—is to cleanse the psyche of extremes of emotion, thus rendering it complacent. An empathic connection with a person or perhaps even a situation or cause is first kindled and then expelled (in the case of the AIDS narrative, the sexual overtones of Brecht’s metaphor are mordantly resonant), and the resulting psychic lethargy works against moral insight and moral action and in favor of the status quo.
As someone whose first novel had been one of those ever-proliferating AIDS narratives—as someone, moreover, who left ACT UP to devote himself to writing and had unknowingly profited from speculation that he might be HIV-positive—I felt complicit in what I saw as the normalizing of the epidemic into just another hazard of contemporary life. I identified with Kurt Vonnegut when he wrote in Palm Sunday of the firebombing of Dresden, which he survived and later chronicled in Slaughterhouse-Five: “One way or another, I got two or three dollars for every person killed.” These words haunted me in the mid-nineties. But unlike my peers, most of whom stopped writing about AIDS around that time (the Lambdas did away with a distinct prize for AIDS writing in 1992, the year before my first novel came out), I couldn’t find a way not to write about the epidemic. And though I would have liked to claim that I’d found the right way to write about AIDS, I was far from confident that that was the case, and had to settle for telling myself that I was at least looking for the right way. Among other directives, I took to heart a passage from “Is the Rectum a Grave?”:
At the very least, such things as the Justice Department’s near recommendation that people with AIDS be thrown out of their jobs suggest that if Edwin Meese would not hold a gun to the head of a man with AIDS, he might not find the murder of a gay man with AIDS (or without AIDS?) intolerable or unbearable. And this is precisely what can be said of millions of fine Germans who never participated in the murder of Jews (and of homosexuals), but who failed to find the idea of the Holocaust unbearable. That was the more than sufficient measure of their collaboration, the message they sent to their Führer even before the Holocaust began but when the idea of it was around, was, as it were, being tested for acceptability during the ’30s by less violent but nonetheless virulent manifestations of anti-Semitism.
If I could not achieve the Brechtian ideal—if my emotional investment in the subject would not allow me to abandon the cathartic mode—then I would aim for the Bersanian. I would try to craft a narrative in which the reader could take no consolation in the depictions of AIDS and the people it afflicted, would instead be forced to find the existence of the epidemic unbearable. I’m not sure if I overestimated my ability or underestimated the mental torpor of the bourgeois book-buyer, but in either case neither I nor my audience was able to foresee the development of combination therapy in 1996, and the almost instantaneous transformation of the public conception of AIDS from a death sentence into a manageable chronic illness.
But that was still a year away. In 1995, discussions about AIDS retained the dubious honor of being teleological. All were elaborations of a single fact, and that fact was this: there was no cure. Of course, there’s still no cure—no cure, no vaccine, no way of getting the drugs we do have to millions of people, both here and in countries that are unable to pay for, distribute, or supervise their use, and no way to tell how long they’ll continue to work for the people who do have access to them. In this regard AIDS has become a conversation with a Yeatsian center, which is perhaps the only way people can continue to talk about it without giving in to despair, or murderous rage. But lest we forget: the real—the only—solution to the AIDS epidemic will not be imaginative but, rather, scientific and bureaucratic. It will be a cure, and, just as important, the means to get that cure into the bodies of people infected with HIV. But until we have that cure, it is the imagination that must provide us with ways of facing this disease. From the time I started writing I wanted to be a part of this imaginative succor. I thought my first novel was about AIDS, but later came to realize that the hopelessness and grief and search for identity at the core of that book are not so much the hopelessness and grief we feel when confronted by AIDS as much as the hopelessness we feel when we first comprehend the inevitable fact of death—and those two things, death and AIDS, aren’t the same thing—not now, in 2013, nor even in the late eighties and early nineties when I was writing Martin and John. 
It wasn’t until my book was out in the world, however, and I found myself surprised by some of the things said about it (and me), that I understood I had displaced my feelings about my mother’s death onto the epidemic, had done the thing I least wanted to do, which was to strip AIDS of its ontological status and make it a symbol, a metaphor—“a universal story about love and loss and the redemptive powers of fiction,” as Michiko Kakutani was kind enough to say in the Times. I displaced my anger at myself onto her too, for pointing out what I saw as an artistic and moral failing.
It’s a lesson every writer has to learn when he or she crosses the bridge into publication, but I was unprepared for it, which is one of the reasons why I tried my hand at journalism in 1993 and ’94 and ’95, and also why I pretty much abandoned it after 1996, by which point I had learned that the journalist or memoirist finds it even harder than the novelist to prevent real people and events from being turned—reduced—to symbol. I took Janet Malcolm at face value when she declared, “Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible”—but only after I’d published half a dozen fulsome essays and articles of my own. Nonfiction presents a sliver as the whole. It offers the proverbial slice of life, but while the writer’s focus is on the slice, the reader’s is on life, which, after all, admits neither ellipsis nor abbreviation. Existence is indivisible. It is either whole or it is false. In order to represent it, then, a writer must capitulate to this synecdochal fabrication. Some do so consciously, others blindly, but in either case even the most sophisticated of readers tends to be unaware of or unconcerned with what he considers a problem of classroom aesthetics or navel-gazing postmodernism—until, that is, he sees himself depicted in someone else’s words, in which case his reaction is almost always “That’s not me,” which, under interrogation, is usually fine-tuned to “That’s not all of me.” Hence the familiar expressions of outrage from one community or another when it sees a depiction of one of its own that it (or, more accurately, some of its members) dislikes. 
These depictions are always said to be “not representative,” not “the real story” or not “the whole story,” and their dissemination to the “general public” is said to be “unfair” or “biased.” And they’re right, of course, at least as far as bias goes (if only in the most literal sense of the word), though I often think their energy would be better spent educating themselves about how representation works rather than clamoring after the Sisyphean goal of a genuinely mimetic journalism or memoir (by which is usually meant a whitewashed or politically correct version of reality). Because Malcolm’s admonition to writers targets only half the problem. The other half is the reader, whose conception of the narrative enterprise tends to be even more unsophisticated than that of the average writer. But then, that’s not the reader’s problem, is it? It’s the writer’s.
11
Tony Kushner, no stranger to Brechtian dialectics, is fighting the same cathartic entropy I mentioned earlier when, at the end of Angels in America, Prior shoos the audience from the theater with the words, “The Great Work Begins.” But of course the great work had already begun when Perestroika, the second part of Angels, premiered in London in 1992 and in New York in 1993. The work had begun in the fifties, sixties, and seventies when gay men created a subculture in the shadows and on the margins of the straight world, a limited but libidinous demimonde within whose borders most gay men—which is to say, the mostly white and mostly middle-class gay men who managed to find their way there—were perfectly comfortable. Certainly there was much to love about that world, and much to lament as well, although the latter had more to do with the ways in which the gay minority mimicked the hetero majority’s hierarchies of race, class, beauty, etc., than with its own (real or enviously imagined) adolescent excesses.
But regardless of its strengths and failures, this world’s continued existence was made untenable by the outbreak of AIDS in 1981, which dragged gay men into the spotlight. No, that’s not quite right: AIDS gave gay men no choice but to step into the spotlight or die in the wings; and in the late eighties, when the gay community recovered its strength and its voice, if not its physical health, the work of building a new culture began. It was, at least marginally, a more diverse group this time around, in terms of race and gender and indeed sexual identity, a time when the word “gay” in many organizations’ names was replaced (although often only nominally) with “gay and lesbian,” then “gay, lesbian, and bisexual,” then “gay, lesbian, bisexual, and transgender,” then “gay, lesbian, bisexual, transgender, and intersex,” an all-inclusive mouthful that was increasingly, in everyday speech if not official communications, replaced with “queer” (or with “gay,” although now “gay” was supposed to mean “gay, lesbian, bisexual, transgender, intersex,” and whatever new or as-yet undeclared sexual identity might reveal itself next). But no matter what they called themselves, late eighties’ queers weren’t content with second-class status. Whether they would rattle their cups against the gates of the heterosexual palace and demand, in Bruce Bawer’s puling phrase, “a place at the table,” or carve out a separate but equal sphere was still up for grabs. For a hot minute, in fact, it looked like the liberationist vision would win out, and I’ve often wondered where we would have ended up if combination therapy hadn’t come along, not to mention the tech boom, which lured both the left and right into throwing over their ideals for the sake of easy money. But if, physically, the improvement was immediately apparent, the cultural gains were measured in smaller increments.
In, say, a shift in vocabulary: in 1991, for example, when Millennium Approaches premiered and gay men were still railing at newspapers and magazines for not acknowledging the true status of same-sex relationships (cf. Craig Lucas and Norman René’s Longtime Companion or Allen Barnett’s “The Times As It Knows Us”), the preferred term for two men in a long-term relationship was still “lovers.” But by the time Angels made it to HBO in 2003, “lover” was giving way to “husband,” which, though tinged with irony, was nevertheless a harbinger of the near-total emphasis on marriage equality that would take over the gay rights movement after the war against AIDS had been “won.” Then, too, there was the question of where you used these words, the government offices or job interviews in which “lover” or “husband” became “friend” or “roommate” or “that guy with me,” or when you just bit your tongue. In the fifties and sixties you made the change out of shame, in the seventies and eighties out of fear, in the nineties out of prudence; but by the time the millennium rolled around you didn’t make it at all, and, what’s more, hardly remembered doing it in the past.