You May Also Like


by Tom Vanderbilt


  My favorite record store, when it still existed, did not simply lump records into “rock” or “jazz” but lovingly curated the most arcane categories: “freakbeat,” “acid folk,” “soft psych.” These were all probably indistinguishable to the average person but glaringly distinct to the store’s clientele. Thinking of some obscure record as part of a larger thing no doubt made me appreciate it more.

  When we do not like something, on the other hand, we tend to dismiss it quickly, with sweeping generality: “I don’t like Spanish food.” Not “I don’t really care for that rare variety of Valencian paella in which the rice is braised in oil.” Liking seems to require finer gradients on a hedonic scale than disliking, as if, once someone had decided he did not like something, he needed fewer ways to express that disliking, or it was not worth the mental energy.

  In an argument that virtually anticipates the Museum of Bad Art, Walton suggested that if we took a “tenth-rate” artwork and perceived it by “some far-fetched set of categories that someone might dream up,” it might begin to “appear to be first-rate, a masterpiece.” An “offending feature” might become a virtue, a cliché in one category might be fresh in another. As the music critic Simon Frith argues, when 1970s disco songs are said to sound the same, it is a negative feature. They are “formulaic.” When folk songs collected from a specific time and place are also said to “sound the same,” this time it is viewed as a good thing (for example, they display “collective roots”). Some of the MOBA’s work would not warrant a raised eyebrow at an “outsider art” show, or at least few would deign to call it “bad.” Sometimes people call the MOBA, Frank told me, and say that “something doesn’t belong here, it’s too good, I like it.” To which he replies, “I like it. If I didn’t like it, I wouldn’t collect it.”

  —

  The idea that one might take pleasure in the avowedly bad is not something that Hume (or any other notable philosopher of aesthetics) was prepared to deal with. We might have our own tastes, our judgment might first be clouded, but eventually good critics would come around to the truth of a work’s quality. If “irregular” art (poetry was Hume’s example) did please, it pleased not by “transgressions of rule or order, but in spite of these transgressions.”

  The work at the MOBA falls into the curious category of “camp.” This is viewing through quotation marks, as Susan Sontag put it, celebrating works—sometimes kitsch, sometimes not—that attempt “to do something extraordinary,” exhibiting a “seriousness that fails.” It is laughing with rather than laughing at. Camp, she wrote, “doesn’t argue that the good is bad, or the bad is good.” Instead, it provides a new set of standards: “It’s good because it’s awful.” This raises the question, per Hume, of whether there can be “good critics” of “bad art.” Frank’s gut-level criterion at the MOBA is that a work be interesting. Merely being bad is not interesting. It needs, to dust off an old category invoked by George Orwell (quoting G. K. Chesterton), to be “good bad.” This is harder than you might think. Frank rejects a lot of works that he says are trying to be “self-consciously silly.” Sontag warned of works that were “bad to the point of being laughable, but not bad to the point of being enjoyable.”

  Something like camp, which did not become familiar until the twentieth century, could only flourish, Sontag argued, in “affluent societies,” where we had grown bored with all the good taste on display and were looking for a new kind of high, a liberation from the endless worrying over what was good. As a curator described some works of the cult filmmaker Ed Wood (who directed the famously awful Plan 9 from Outer Space), the films “are rough and tumble and ugly, and if you can accept them on that level, then good or bad stops being a question.” Of course, like cultural “omnivorousness,” camp appreciation could also be a new way of proving one’s cultural authority, by knowing what good camp is—a “good taste of bad taste,” as Sontag describes.

  Camp is now only one of the many complicated ways we have of interacting with taste objects. Were David Hume to return and draft a new “Standard of Taste,” he would surely be flummoxed by the complex taxonomy that now prevails in our “contest of sentiment.” He would have to, for example, understand the difference that can exist between camp sensibility and the “ironic” consumption of a bad painting or television show. Irony is all about protective distance and derision; one watches “serious” (but bad) television for laughs. Camp too can be “frivolous about the serious,” as Sontag said, but it is a way of celebrating the failed work, of getting closer to it. Irony is an emotional dead end; you cannot ever love ironically.

  Hume, too, would have had to grapple with the notion of “hate-watching,” a term popularized by the television critic Emily Nussbaum to signify the act of watching a show that one actively dislikes: “Why would I go out of my way to watch a show that makes me so mad?” She must, “on some level,” have been enjoying it, even though it “was bad in a truly spectacular way.” Hate-watching may be the inverse of camp: not loving a work because it tried something grand and failed, but hating it because it did not try hard enough and seemed annoyingly, not charmingly, unaware that it was not good (or, per Orwell, “bad good”). Or, perhaps, as Stendhal hinted, the things that once seemed so hateful may come to signify love.

  Lastly, Hume might have wanted to spend some time with the concept of the guilty pleasure. This is a term he would have known, though not in today’s context. Samuel Johnson, writing in The Rambler in 1750, talks about a man who is rather self-satisfactorily “dwelling with delight upon a stratagem of successful fraud, a night of licentious riot, or an intrigue of guilty pleasure.” In Johnson’s day, of course, a guilty pleasure really was a guilty pleasure. It was not having a second piece of chocolate cake; it was visiting a brothel or some other actual transgression against established moral codes (the pleasure would wane, Johnson warned, but the guilt would not).

  Only in the last few decades, however, as a survey of Google Ngram reveals, has the phrase “guilty pleasure” gained any real currency. We now use it primarily to talk about two things (often particularly implicating women): consuming culture and food—here again that slippage of the word “taste”—that we know are not “good for us” but like anyway. The composer Nicholas McGegan compares his love of Strauss’s waltzes to a high-cholesterol dessert, things he is not supposed to like: “I listen to it in secret much as one might eat a large portion of chocolate cake behind closed doors.”

  “Guilty pleasure” is a curious concept. There is a question of causality: Does the pleasure cause the guilt, or does it in fact stem from it? Would guilty pleasures be pleasurable without guilt? Or does the guilt come because we are not feeling any guilt for indulging in the pleasure?

  Indeed, if we truly felt remorseful in having indulged in the guilty pleasure, we would not speak of it. The act of saying it out loud declares we are merely tourists in this temporary departure lounge of bad taste. By labeling something a guilty pleasure, we give ourselves license to consume it. In one study, subjects were offered a piece of chocolate cake (apparently, the benchmark for guilty pleasures!). Even when they anticipated feelings of guilt, most were no less likely to want to consume it. The only people who were put off the cake by the prospect of guilt were those who least wanted it in the first place. Merely triggering feelings of guilt, the researchers speculated, might forge a mental pathway to pleasure—as if we were primed to think that something which makes us feel guilty will also make us feel good. It probably will, after all, at least before the fact. As Samuel Johnson observed, “In futurity events and chances are yet floating at large, without apparent connexion with their causes, we therefore indulge the liberty of gratifying ourselves with a pleasing choice.”

  If we really felt bad about a book we had read and liked—some abominable, morally repugnant tract—we would feel shame, not guilt. The two words may be conflated in your mind, but psychologists have made a convincing case that they are distinct phenomena.

 
  One proposed difference is that shame is a “pure affective state”; you know it when you feel it (or see it upon a person’s face). Guilt, on the other hand, is an “affective-cognitive hybrid”; one often has to think about why one feels guilty. Shame, it has been argued, more often indicts the self, while guilt indicts a particular act. One says you are a bad person; the other says you did a bad thing. Guilt offers the promise of atonement, hence “confessing” that we watched bad reality television. The only punishment will come from ourselves. To assuage guilt when we “transgress” against someone, the thinking goes, we may try to become more helpful, or we may in fact channel our negative feelings onto the victim (particularly if he is in an “out group”). This, I would argue, is what we do with “guilty pleasures”: We consume some bit of culture that we feel is beneath us and then label it trash (not ourselves!).

  The guilty pleasure is not only a “license to consume” but a signaling device: You and I both know this is beneath us, or, perhaps, we are above thinking it is beneath us. By calling it a guilty pleasure, we can assure ourselves (and others) this is actually the case. You would only call something a guilty pleasure in the presence of someone who would also find it a guilty pleasure. To the person who considers eating at a fast-food restaurant a relative luxury (as I did when I was young), you do not speak of eating there as a guilty pleasure. The whole construct of the guilty pleasure is oriented, culturally, downward. If the guy who every night watches mixed martial arts while eating atomic wings on his La-Z-Boy recliner (whether the Thomas Kinkade model or not) is tempted by a box at the Metropolitan Opera for the matinee of Rigoletto, he is probably not thinking of it as a guilty pleasure.

  But to argue the opposite, that guilty pleasures should not exist, smacks of condescension even as it flaunts its democratic inclusiveness. For to declare what should not be considered a guilty pleasure is as judgmental a move as to declare what should be. To go further and declare that nothing we eat, watch, listen to, or read should be done without a twinge that something better lies out there is the chauvinism of one who knows there is and has already been there. In a way, this catholic, nonjudgmental omnivorousness might be the new snobbery, and you can almost feel the mix of pity and raised-eyebrow disdain as someone asks, “Oh, you feel weird about liking this?” Even as the idea of overarching standards is dismissed, a thousand new ones are invented, and it is now not so much what you like or should like as why or how you like. That contested terrain between you and what you like, on the one hand, and what you (or others) should like, on the other, so vexatious to Hume, now resembles a hopelessly booby-trapped minefield in a DMZ of taste, that thing we try not to mention but say much about in our silences.

  * * *

  *1 An homage, apparently, to an image found at a Japanese fertility museum.

  *2 There were many other supporting players in this extravaganza of wrestling with aesthetics, of course, from Lord Shaftesbury to Edmund Burke to Nietzsche. But Kant’s and Hume’s theories have generated the most subsequent attention.

  *3 It depends, of course, on what side you were walking on prior to entering the gallery; people in the U.K. tend to walk more on the left, and so they turn more toward the left when entering galleries.

  *4 Curiously, the provenance of this work has recently been called into question, which raises an interesting philosophical quibble: Can pleasure for the inauthentic be thought of as an authentic pleasure?

  *5 The Darmstadt version became generally accepted as the real work, so score one for the aesthetic wisdom of crowds. It was purchased in early 2014 for seventy million dollars.

  *6 They are out there, given the estimate that his work was presumed to be found in one in twenty American homes.

  CHAPTER 5

  WHY (AND HOW) TASTES CHANGE

  I liked liking things before they were cool before it was cool.

  —Joss Whedon

  WILL YOU STILL LOVE ME TOMORROW?

  In 1882, at a Christie’s auction in London, the record was set for the highest price ever paid (six thousand pounds) for a painting by a living artist. The work? The Babylonian Marriage Market, by Edwin Longsden Long. If the crickets are chirping as you ransack your memory, do not worry: Neither Long nor his monumental canvas is a household name today.

  But both certainly were in 1882, when Thomas Holloway, the famed English vendor of patently iffy patent medicines (Queen Victoria herself was said to have consumed Holloway’s Pills), paid his colossal sum. The painting, a large, dramatic, exquisitely detailed study of an ancient market where women without dowries were subsidized via the trade in more desirable brides (as described in a “far-fetched” tale by Herodotus), was a sensation, both popular and critical. No less an eminence than John Ruskin deemed it a “painting of great merit.”

  Despite its ancient subject, it was very much of its day. Its eroticized Orientalist imagery and none-too-subtle commentary on the pecuniary nature of marriage in contemporary London were Victorian catnip. As the magazine of the Royal Academy suggested, the painting had it all: “richness and archaeology, scenic drama and amusement, much beauty and some grotesque by-play, antique fact and modern innuendo.” It even spoke slyly to the ascendant—some said inflated—art market, roiled by men of money like Holloway. Indeed, the auctioneer in the picture was reputedly modeled on Christie’s own auctioneer, the would-be bride buyers a subtle play on art dealers.

  Long, a prolific and popular painter in his day, completed his blockbuster in 1875. The date stands out, for it was the year another noteworthy, though very different, art auction was held, at the Parisian house Drouot. The artists included Monet, Sisley, and Renoir. Instead of historical and painterly accuracy and big social themes, these works took on mundane subjects in a style one French critic—hardly out of touch with prevailing taste—likened to what happens when a monkey “might have got hold of a box of paints.” Far from commanding record-setting sums, the prices realized at the auction, as the onetime Christie’s director Philip Hook notes, “were dispiritingly low.”

  We know how the story turned out. Long, while largely maintaining critical respect as an artist, gradually faded from view, while the much-scorned Impressionists, whose work could often not be given away, went on to become the equivalent of rock stars playing to sold-out arenas—the kinds of artists who are familiar to people who are not into art. Renoir’s La Loge, which sold for a then paltry 220 francs at the 1875 auction, sold for $14.8 million in 2008.

  What changed? Not the paintings themselves, but tastes: the way the paintings were seen, what they seemed to say, the rules they adhered to (or broke). Long’s painting, however much it tapped into a kind of Victorian zeitgeist, did not seem to speak as much to succeeding generations, nor did its scrupulous academic style seem to excite subsequent critics. Photography captured the ground of realism. The Impressionists, meanwhile, saw all their faults turn into virtues. Writes Hook, “The garish color started looking exciting; the lack of finish was increasingly perceived as an exhilarating freedom of brush stroke; and the banality of subject matter took on the reassurance of the everyday, a confirmation of the universality of bourgeois experience.”

  There is always the chance, however unlikely, that the story may yet change again, with Long and his fellow Victorians propelled to some new esteem, the Impressionists pushed into some dark dustbin. Hume was well aware of the volatile, market-like shifts in taste. “Authority or prejudice may give a temporary vogue to a bad poet or orator,” he wrote, “but his reputation will never be durable or general.” He had enduring faith in the test of time: “On the contrary, a real genius, the longer his works endure, and the more wide they are spread, the more sincere is the admiration he meets with.” Yet this hardly reassures. Are the Impressionists as qualitatively better than Long as the swing in valuation would imply, or is popular taste at work? And consider how many “lost masterpieces” have been revived. If they were so great to begin with, how did they go missing? Perhaps this only reaffirms Hume: that they fell out of favor because of “temporary vogue,” but then, having been rediscovered and championed by one of his qualified critics, they are there, as good as ever, waiting to rejoin their deserved place in the canon.

  The point is that we have little guarantee that those things most celebrated and esteemed today will be celebrated and esteemed tomorrow. But why are tastes, such an anchor in our daily lives, also so fleeting?

  —

  If you had asked me, when I was ten, to forecast my life as a settled adult, I would probably have sketched out something like this: I would be driving a Trans Am, a Corvette, or some other muscle car. My house would boast a mammoth collection of pinball machines. I would sip sophisticated drinks (like Baileys Irish Cream), read Robert Ludlum novels, and blast Van Halen while sitting in an easy chair wearing sunglasses, Maxell-ad style. Now that I am actually in a position to realize every one of these feverishly envisioned tastes, they hold zero interest (well, perhaps the pinball machines in a weak moment).

  It was not just that my ten-year-old self could not predict who I would become but that I was incapable of imagining that my tastes could undergo such wholesale change. How could I know what I would want if I did not know who I would be? The psychologist George Loewenstein has called this “projection bias.” “People behave as if their future preferences will be more like their current preferences than they actually will be,” he writes, “as if they project their current preferences onto their future selves.”

 
