This Is Running for Your Life


by Michelle Orange


  Yet there is no way of knowing whether the young man had a subjective experience of that response, since scans have also shown that the unconscious brain is capable of responding to stimuli even when what we think of as the conscious mind registers nothing. One term for this phenomenon is blindsight. If patients who have sustained damage to the visual cortex say they cannot see the photo of the smiling face being held in front of them but their unconscious brains say they can, are both responses true?

  The legal and ethical debates over such questions will shape a larger discourse. In the future of consciousness, it would seem, subjectivity is somehow both paramount and beside the point. The protection of human life is ostensibly behind all medical advances, yet with neuroscience in particular the terms of what it means to be human are blurred. The question of quality comes into play more urgently in discussions of what it means to be alive, or to be capable of living the kind of life we think of as “good.” Again, the concept of “bad” is defined by a kind of default. Only the scientific path is clear: it is always better to know more, to pursue the science to its ends and then treat the human conundrums that result as inevitable.

  It’s no wonder we have started pair-bonding with our iPhones. In device attachment resides the old struggle between the possessor and the possessed, the shifting sands of desire and consent. What we respond to is not the gadget itself but its promise of some personal and highly specific gratification. Yet love must find its object. The image of a phone shivering alone on a table is no more or less loaded than a pinup of Brigitte Bardot. The image of a human being strapped into a three-ton magnet to watch a vignette of a smart phone shivering alone on a table, however, is a dystopian icon of heartbreak.

  * * *

  MindSign studies offer clients something that eludes even the most elaborate analog focus group: reliability. It’s pitifully easy, however, to outperform a system based on consulting college kids—“drunks and stoners,” in Carlsen’s words—for their thoughts on a $250 million movie. Carlsen cites his own time making the focus-group rounds while he was a film student as proof of their uselessness. After watching an early cut of a 2004 Dennis Quaid film called Flight of the Phoenix, Carlsen expedited the questionnaire like a man with better ways to waste his time.

  “I literally was like ‘Good, good, good, good, good, good, good, good, good, good, good,’” he said. “I want my ten bucks, I’m done.” Other data-occluders include our ovine tendency to defer to the loudest person in the room, and the inclination to tell a sympathetic or otherwise great-looking questioner what he wants to hear. As well as doing away with the obstructive veils of subjectivity, MindSign cuts the average focus group down from between fifty and one hundred subjects to the magic number of sixteen. “That’s the number for statistics, as far as fMRI science goes,” Carlsen said. “Sixteen is enough to speak for the population in a specific demographic.”

  Calming investors is a major selling point of this kind of science; calming investors has probably always been a kind of science. But when I point out that creating a better focus group bypasses the question of why the film industry began outsourcing its creative autonomy to average civilians in the first place, Carlsen is sanguine. He’s seen too much to worry about furthering a process that began shortly after he was born.

  “I just came from Hollywood,” he said, in response to a question about the protests of writers and directors against his line of work. “You think those guys have any power? I worked for four producers [including Quentin Tarantino’s longtime producer, Lawrence Bender]. I never wanted to be a producer, I just did it because I didn’t have any money and I needed a job.” Carlsen sat in on the meetings we all now know from the meta-genre of “backstage” movies and TV shows, where talent is massaged in the room and spit on the second it leaves, major decisions hinge on a coked-out executive’s fuckability call, and assistants spend their days making twenty copies of the latest piece of shit movie because it tested well and the boss is excited about it again. “It’s like that everywhere,” Carlsen said. “It’s just how it is. But great art will demand—” He paused. “Great art is always going to come through.”

  This, certainly, is the hope. The studio system, for all its flaws and foibles, was a closed house, the proverbial dream factory from which all Hollywood movies flowed. The independent renaissance of the 1970s privileged individual directors over a tightly run system. Film was ordained as an art form and filmmakers as artists, complete with their infernal talk of vision. Then, as film journalist Mark Harris has suggested, the monster success of Top Gun in 1986 solidified a producer- and marketing-driven generation of “high concept” blockbusters, “pure product” films interested chiefly in “the transient heightening of sensation.” The 1990s hosted a resurgence in independent filmmaking, a flare of innovation whose ashes were sifted through for the next decade, mostly by studio boutiques hoping the residue of “quality” might rub off. What prevailed in the mainstream were tent-pole pictures based on established brands, endless sequels to fluky original hits, and children’s movies.

  Developing a movie the way you would a brand seems like the safest option in a time when audiences are fragmenting and piracy is making it harder with every passing evening to get people out of their living rooms. Even the toughest-minded forecasts from previous decades—all that high/low hand-wringing—read a little quaint today. Think of Pauline Kael forewarning of a future in which the good old popcorn movie disappears up Antonioni’s backside in an anomic ten-minute shot. Or Joan Didion arching precisely 3.5 millimeters of her brow over the empire that produced John Wayne giving way to an age of masturbatory passion projects. And that was the sixties. As a mass, middlebrow culture loomed, critics were preoccupied with segregating taste levels to their respective cultural water fountains. Eventually a policy of separate but equal arrived, and soon after that miscegenation ruled. In recent years the folding chair of creative power—having been passed from the studios to the independent auteur to the superproducer—ascended into the response-card cloud. Exit interviews, test screenings, and now brain-imaging scans help shape a film. It would seem that the audience has never been more powerful: united we stand, loving Bridesmaids one night and Nuri Bilge Ceylan the next.

  The phrase lowest common denominator, often applied to questions of taste, is actually a matter of biology. Our most basic human responses are basic because they are shared: hunger, sex, sleep, fight or flight, pain—the pain response is so old it’s vertebraic. If you gather enough people together and question them along a broad enough line, their responses will boil down to the consistency of a thick, primordial ooze. No one is above an ooze-based movie; it’s designed to stimulate everyone with a working spinal cord. But what’s being stimulated is a body, not a mind or spirit or even a brain. The latter is required for the punching of sensory buttons, but the human housing those responses is not. Denuding those senses—through deprivation or overstimulation—is a quick way to dehumanize. Just as our digestive tracts learn to break down frequently eaten foods more efficiently and develop resistance to foods we avoid, the psychologist Jonathan Schooler has suggested, the threshold for sensory stimulation adapts by a process called cosmic habituation. Where that threshold might top out, nobody knows. But it might be worth a guess.

  * * *

  Anyone standing downriver of the mainstream cannot escape this stuff. After several years of being expelled into the evening with the stink of a bad movie on me, I began to discern a hierarchy of badness. The bottom two-thirds of Shit Mountain is just the law of averages in digestive action, the same crap there has always been and ever will be. Only from the pinnacle does the horizon’s bleakness come into view.

  I remember my first glimpse from the top of Shit Mountain well. It was 2007, and I was watching the third installment of Pirates of the Caribbean, the three-billion-dollar Disney franchise based on an amusement-park ride. It is the first time I can remember feeling that a movie was providing a direct and seriously unflattering reflection of what its makers thought of me. Savvy advertising is always trying to tell you something about yourself; it traffics only in different, better, more fulfilled versions of you. That’s why it’s so miserably effective: an ad can adopt the stance of leading you toward your own best interests. But a brand-centric movie is stuck pretending its purpose is to entertain, even if its job was done the moment it got you through the door, $13.50 lighter. And that’s where movies like Pirates get caught out—it’s the ad and the product, a long commercial for itself with nothing to actually sell.

  In my review I used a word that critics have adopted as shorthand for the sins of the modern theme movie: cynical. More than pandering or mercenary laziness, these often wildly profitable films give off the chill of the inhuman. Things we associate with a story line occasionally occur, yet no story takes shape. Action set pieces prime us for a climax, yet we stagger out feeling gypped. Actors who resemble human beings speak in understandable sentences, yet nothing they say sinks in. Actually, my visit to MindSign made more sense of films like Cowboys & Aliens and I Am Number Four—the latter a non sequitur genre amalgam with high activation potential and low everything else—than a third or fourth viewing could have. Here was an explanation for the persistent, peculiar feeling that much of what I watch has issued not from flesh and blood but a floating, flashing brain.

  I tried to talk to Carlsen about how bad it is out there, as if he didn’t know. He assured me that science can help. He foresees neurocinema decamping from the fear- and threat-assessment responses of the amygdala to resettle its interests around Brodmann Area 25, the section of the ventromedial prefrontal cortex thought to mediate between the external world and the working memory to produce a sense of self. It’s the spot they see lighting up most often during what Carlsen calls “heartfelt” scenes. “In our opinion,” he said, “that’s the coolest area right now.”

  I mean, it sounds cool. But I’m not sure it can stop what has been started or outperform the lambent gold mine of the twelve- to twenty-five-year-old male amygdala. We are now weaning a generation that has assimilated activation as entertainment, to the dismay of at least one subway-riding dad recently overheard complaining to a colleague about the way that the Nickelodeon network’s signature rapid cutting and jarring sounds and colors transform his seven-year-old daughter into one of those new breeds of zombie who can book it like a black bear. This is, of course, a very circular concern: our parents told us television would rot our brains, but the age of iCarly and Hannah Montana makes Sesame Street look like The Mark Twain Children’s Hour.

  “I think that’s why kids are kind of … awful today,” said Carlsen, who wound up in the children’s television arm at DreamWorks. “You know? They don’t have good TV.”

  Evaluating the classics against this new metric produces scattered results. A 2008 study conducted by Israeli neurobiologists Uri Hasson and Rafi Malach found that an episode of Alfred Hitchcock Presents produced nearly identical patterns of brain response across a group of viewers, results that could only please a director who conceived of audience response as a form of reflexology. But Hitchcock was also profoundly interested in representing subjectivity; few directors made as plain the idea that we go to the movies to literally see how things feel. There may be no better statement on subjectivity vis-à-vis the movies than the one presented in Rear Window. The attempt to grab each viewer individually gives art to what would otherwise be an exercise in collective muscle control, something achievable by a lone tripod trained on a Wimbledon match.

  Hasson and Malach’s studies helped spark Carlsen and Hubbard’s interest in neurocinema. They conducted a study of their own involving the original, 1968 version of Night of the Living Dead.

  “Boooring,” Carlsen said. “Unfortunately. It was very nonactivating. And, like, that movie—it scares the hell out of me.” I wondered how he felt about helping create a standard that contradicts his own experience of what makes something good or sweet or scary. “Culture is different now,” Carlsen shrugged. “Often the people we scan are young college kids, and one could argue—I don’t know if this is actual fact—but one could argue that it’s just a different culture. They’re attracted by colorful things, pretty things, not as … old.” A movie like The Kids Are All Right, which Carlsen cited as an example of a recent film he loved—“it had me just as engaged as any Harry Potter movie or anything else”—will never activate the way a brain-pummeler like Battle: Los Angeles does.

  I mentioned Battle: Los Angeles—concept: aliens versus marines—to Carlsen because it had recently jackhammered a spot in the forefront of my memory of violently unpleasant viewing experiences. It’s the only film whose equilibrium-scrambling combination of epileptic camerawork and dysphasic, microsecond shots has actually made me sick. I was still nauseous almost forty-eight hours later, when I had to file my review. The judgment seemed obvious; the real challenge of the new generation of concept-driven, high-activation films is figuring out enough of what you just saw to write an opinion. The tighter the elevator pitch, the more elusive the explanation of the resulting film.

  Renata Adler famously put a four-year warranty on the sensibility of any practicing critic; I think lingerie models have longer careers. I had been reviewing films for four and a half years the night I staggered back into the neon hippodrome of Times Square, my mood as black and bottomless as the divot in Aaron Eckhart’s chin. For me steady reviewing was something of a fluke—a way to earn a living on the way to wherever I was going. It felt like a miracle then; for the most part it still does. But in addition to experiencing the habituation that can turn even a great job into a bit of a drag, I have come to suspect that writing for a living is one of the trickiest things a writer can do. With all due relativity noted, it is tough out there for a second-string reviewer; you can go to some dark places watching Hoodwinked Too! or blowing another evening on a movie based on a comic book that was invented for the sole purpose of spawning a movie that could then make the illustrious claim of having been based on a comic book.

  And trust me, no one looks to the culture or nobility of the profession for sustenance: when the Internet is not whizzing on your reviews and calling for your commitment to a cultural reeducation camp, your own colleagues are pretending not to notice each other on the screening-room circuit, each time behaving as though they’ve arrived at a 6:00 p.m. showing of Battlefield Girth and are desperate for the lights to go down. Perhaps the only thing worse is when people start talking. The urge to concuss myself with a seat rest when I hear the adenoidal stringers behind me wondering if it’s “reductive” to hate a film they’ve only heard about—maybe that’s an overreaction. Or wanting to clink the tilted heads of the bald white men seated in front of me together as they segue from an appreciation of the timeliness of The Help to seditious whispers about that bald-white-man-disappointer President Obama (“I remember weeping with joy,” said the one bitterly. “Literally weeping”)—perhaps that is a disproportionate response.

  Where criticism first seemed a natural and happy extension of my love of movies and writing, lately I have started to feel that there’s something unnatural about habituating to—that is, making a career of—a certain level of stimuli. I don’t mean just at the movies but at the movies especially, where studios seem less interested in engaging individuals than in equalizing audience response. That I am almost shy to admit fearing for my senses suggests how loath we are to be dismissed ourselves, aware on some level that to resist the culture’s unchecked velocity is to fail.

  “Neurological data is just another thing,” Carlsen said, in response to a question about the impact neuromarketing will have on the creative gene pool. “It’s not changing movies. It’s just making better products for the consumer. And that could be anything—it could be a better movie trailer, a better movie, a better Oreo-cookie package.” I asked what he meant by “better,” wondering out loud about the nefarious possibilities of this kind of research, its capacity to exploit the parts of our viewing, wanting, buying selves that we are not aware of. MindSign’s goal is to get the suits comfortable with scanning everything back to the pitch—to determine, say, if simply hearing the words aliens versus marines activates well.

  “I don’t know if Hollywood’s ready to bite onto that yet,” Carlsen admitted. “Somebody will, soon enough. That’s kind of the thing with Hollywood, and America: somebody’s gonna do it eventually.”

  * * *

  The machine was free until one, and Carlsen offered to scan me while I watched a couple of movie trailers. A new Pirates movie was on deck—number four or five, neither of us could remember—and Carlsen had cited it several times as the exact kind of film that could use their help. The trailers on hand were a little older, one for Red Riding Hood—a gothy teen “reimagining” of the children’s fairy tale designed to catch some tailwind action from the Twilight franchise’s flapping druid robes—and one for the aforementioned Battle: Los Angeles.

 
