
The Formula: How Algorithms Solve All Our Problems... and Create More


by Luke Dormehl


  Desiring Machines

  There is little disputing that Kari is an attractive girl. She has bright blue eyes, made all the more attention-grabbing by the smoky eye makeup she wears. Her hair is cut to shoulder length and is bottle blond in color, with just the slightest trace of what one presumes to be her natural brown creeping in at the roots. Her face suggests that she is in her early twenties, and as she looks toward the camera, her full lips part slightly as a coquettish smile plays over her features. It’s only then that you notice for the first time that she is wearing cotton candy–colored lip gloss.

  If Kari sounds a little, well, fake, that is entirely understandable. She is not real, at least not in the sense that you or I are—existing purely in the form of computer code. Her name is an acronym, standing for Knowledge Acquiring and Response Intelligence, and she is what is known in the trade as a “chatterbot,” an algorithm designed to simulate an intelligent conversation with an actual person. Kari (or, perhaps, K.A.R.I.) is the creation of Sergio Parada, a thirtysomething programmer living in West Palm Beach, Florida.

  Born into a military family in San Salvador, El Salvador, Parada moved to the United States with his parents when he was ten in order to escape the Salvadoran civil war. After studying video games and programming at Chicago’s Columbia College, Parada picked up a job working on the Leisure Suit Larry series of adult video games for the now-defunct developer Sierra Entertainment. It was while he was employed on the game tentatively titled Lust in Space (aka Larry Explores Uranus) that Parada came up with the idea of creating a piece of software centered entirely around one’s interactions with a solo female character. “It was something brand new,” he recalls, describing his aim as being “not just a girl simulation, but a relationship simulation.” Unlike Leisure Suit Larry, which follows an affable loser as he attempts to bed a procession of ludicrously attractive women, in Kari Virtual Girlfriend there would be no set narrative. This necessarily meant that in the traditional game-playing sense there was also no way to win. Like a real relationship, the game continues indefinitely, with the reward being a relationship that grows and deepens.

  In the same way that people anthropomorphize pets, attributing human characteristics to them to explain their behavior, so too does Kari build on the willingness to protect, feed, shelter and guard machines that is second nature to a generation who grew up with Tamagotchis: the handheld digital “pets” that enjoyed a burst of popularity in the late 1990s. A number of psychology researchers (with two of the most prominent being Mihaly Csikszentmihalyi and Eugene Rochberg-Halton) have investigated how meaning is created through the investment of what they call “psychic energy.” The idea is that as a person invests more psychic energy in an object, they attach more meaning to it, making the object more important to them and their attachment to it all the stronger. This effect is only strengthened by the fact that Kari encourages the user to talk and proves an adept listener, which might suggest an additional reason for the magnetic pull she wields over some users.

  In a 1991 study by Arthur Aron, a psychology professor at Stony Brook University in New York, pairs of students who had never previously met were placed in a room together for 90 minutes, during which time they exchanged intimate pieces of personal information, such as their likely reaction to the loss of a parent or their most embarrassing personal moments. At the end of the time period, the two were asked to stare unspeaking into one another’s eyes for a further two minutes. The students were asked to rate the closeness they felt at various intervals and, after just 45 minutes, these ratings outscored the closeness that 30 percent of other students reported feeling in the closest relationships of their lives. Aron’s conclusion was that disclosure is therefore both fast-acting and powerful as a device to increase personal attraction. (As if to bear this theory out, despite the students exiting the room from different doors after the experiment was over, so as to remove any sense of obligation to stay in touch, the first pair to take part were married six months later.)

  One major difference in the relationship Kari has with her users, when compared to the vast majority of romantic relationships (aside from the obvious one), is that in order to avail oneself of her services, a person must first purchase her. The standard version of the program costs $24, while the “Kari Ultimate Package”—which comes with an Avatar Studio to “make your own girls”—will set users back $90.39

  For those who do buy her, the relationship conforms to what French philosopher Michel Foucault refers to as “spirals of power and pleasure.” The user’s pleasure is linked to the power they hold over their virtual girlfriend. It is a technological reimagining of the Pygmalion fantasy, in which Kari takes on the role of Eliza Doolittle: the naive, young, working-class girl initiated into the sexual world by the older, wealthier, more experienced Professor Henry Higgins.40 Of course, where Henry Higgins handcrafted his idealized companion by teaching her genteel manners and elocution, with Kari the user has the ability to fine-tune specific areas of her personality, such as her “love,” “ego” and “libido” ratings, which are measured on an adjustable scale from 0 to 10. Find that Kari skips from topic to topic when she talks? Just increase her “stay on topic”-ness, or else change the number of seconds between unprovoked comments. Worry that she is starting to act too aloof? Lower her “independence” and, if that doesn’t work, her “memory scope.” This can have an unfortunately detrimental effect on Kari’s AI, leading to one prickly reviewer complaining that “one minute she’s spouting some dime store philosophy, the next she’s asking to be ‘f***ed so hard,’” and describing the result as “the video game equivalent of dating a drunken first-year philosophy student with really low self-esteem.”41
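  To make those sliders concrete: the traits described above amount to a small bundle of numeric settings the user can turn up or down. The Python sketch below is purely illustrative; the field names, defaults and clamping behavior are assumptions made for the sake of the example, not Kari’s actual code.

```python
from dataclasses import dataclass


def clamp(value: float, low: float = 0.0, high: float = 10.0) -> float:
    """Keep a slider value inside the 0-10 range described in the text."""
    return max(low, min(high, value))


@dataclass
class PersonalitySettings:
    # Hypothetical fields mirroring the sliders mentioned above.
    love: float = 5.0
    ego: float = 5.0
    libido: float = 5.0
    stay_on_topic: float = 5.0          # higher = fewer abrupt topic changes
    independence: float = 5.0           # lower = more attentive
    memory_scope: float = 5.0           # how far back past conversation is recalled
    seconds_between_comments: int = 30  # pause before unprovoked remarks

    def adjust(self, trait: str, value: float) -> None:
        """Set a trait, clamping 0-10 sliders and keeping the timer positive."""
        if trait == "seconds_between_comments":
            setattr(self, trait, max(1, int(value)))
        else:
            setattr(self, trait, clamp(value))


# The scenario from the text: Kari seems too aloof, so the user lowers
# her independence and, if that fails, her memory scope.
settings = PersonalitySettings()
settings.adjust("independence", 2.0)
settings.adjust("memory_scope", 3.0)
print(settings)
```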

  Ultimately, it is Kari’s malleability that forms the crux of her appeal. Unlike the other “flickering signifiers” we might fall in love with—be they film actor, pop star, or doe-eyed supermodel plastered across a billboard—Kari can adjust to appeal to each person individually; quite literally living up to the prostitute’s classic sales pitch, “I can be whoever you want me to be.”

  “Kari fills that hole we all have inside ourselves for a connection with someone,” Parada notes. “Some users write to me with word of the interesting projects they are conducting. One wanted to get her pregnant and see the nine-month cycle of having a baby with his Kari. Another wanted to give her menstrual cycles. Another fed his entire journals to it and said it was better than therapy. Whatever you’re up to, Kari is a big canvas to work with. It’s a canvas of awareness and thought; an extension of the self. Whatever we choose to expose our Karis to is what they will become. It’s like creating a being from the things we choose to teach it. The best part is that this is all integrated into an avatar that can talk back to you, be your best friend—and even love you.”42

  Love and Sex with Algorithms

  The Kari-esque dream of an algorithmic lover has existed at least as long as there have been personal computers, from 1984’s Electric Dreams (in which a San Francisco architect’s newly installed computer system grows jealous of its owner’s burgeoning relationship with an attractive neighbor) to Spike Jonze’s 2013 film, Her.

  In Japan, there exists a popular genre of video game known as the “dating sim.” The typical dating sim (short for “simulation”) allows the player to take control of an avatar, who is then confronted with a number of potential love interests whom they must choose between. As with role-playing games, dating sims rely heavily on statistics. A character’s conversation with a would-be mate is measured according to their choice of appropriate lines of dialogue, with overall scores improving or worsening depending upon which line is chosen at a particular time. In one such scene, a female character is depicted eating an ice cream, a daub of it having smeared her cheek. Players are given the option of “Wipe it off for her” or “Pretend I didn’t see it,” while an additional piece of information warns, “Her affection level will change according to your choices.”43
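  The scoring mechanic at work here is simple enough to sketch: each line of dialogue carries a hidden adjustment to the character’s affection level. The Python below is a toy illustration; the point values and the 0–100 scale are invented, not taken from any particular dating sim.

```python
# Toy illustration of the mechanic described above: each dialogue choice
# carries a hidden delta that raises or lowers the character's affection
# level. The labels come from the scene in the text; the point values and
# the 0-100 scale are invented.

affection = 50  # starting score on an assumed 0-100 scale

choices = {
    "Wipe it off for her": +5,
    "Pretend I didn't see it": -3,
}


def choose(option: str) -> int:
    """Apply the affection change for the chosen line of dialogue."""
    global affection
    affection = max(0, min(100, affection + choices[option]))
    return affection


print(choose("Wipe it off for her"))  # 55 -- "Her affection level will change
                                      # according to your choices."
```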

  Perhaps the most notable exponent of this game-space view of relationships is artificial intelligence scholar and chess master David Levy. In Levy’s most recent book, Love and Sex with Robots, he offers the prediction that not only will the concepts suggested in his title be possible in the future but that, by 2050, they will be routine. As Levy argues:

  One can reasonably expect that a robot will be better equipped than a human partner to satisfy the needs of its human, simply because a robot will be better at recognizing those needs, more knowledgeable about how to deal with them, and lacking any selfishness or inhibitions that might, in another human being, mitigate against a caring, loving approach to whatever gives rise to those needs.44

  In a moment of particularly questionable human understanding, Levy theorizes that an algorithm could analyze the brain’s “love measure” by way of an fMRI scanner and use this information to hone specific strategies to make a person fall in love. Like a game of chess, the path to true love would thus consist of a series of steps, each move either lowering or raising the love measure exhibited in the brain. In the manner of a smart central-heating thermostat, the aim would be to automatically adjust warmth or coolness so as to keep conditions at an optimal level. Suggestion that you watch a Tchaikovsky ballet together met with a frosty response? Try complimenting the other person’s new haircut instead. An algorithm could even take note of the low-level features we might be struck by but unable to verbalize in partners, such as the way the lover flicks their hair or lights a cigarette, and incorporate this into its seductive repertoire.
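  Stripped of the fMRI scanner, Levy’s thermostat analogy describes a greedy feedback loop: try a move, keep it in the repertoire if the measured “love measure” rises, drop it if it falls. The Python sketch below illustrates that loop under stated assumptions; the action list is invented and the sensor is a stub standing in for the brain reading Levy imagines.

```python
import random

# Greedy feedback loop in the spirit of Levy's thermostat analogy: try a
# move, keep it if the measured "love measure" went up, otherwise move on.
# The action list is invented and read_love_measure() is a stub standing
# in for the fMRI-derived signal Levy imagines.

ACTIONS = [
    "suggest a Tchaikovsky ballet",
    "compliment the new haircut",
    "mention the way they flick their hair",
]


def read_love_measure() -> float:
    """Stub sensor: in Levy's scenario this would come from a brain scan."""
    return random.random()


def run_courtship(steps: int = 10) -> list:
    """Collect the moves that appeared to raise the love measure."""
    repertoire = []
    baseline = read_love_measure()
    for _ in range(steps):
        move = random.choice(ACTIONS)
        reading = read_love_measure()
        if reading > baseline:   # warmth went up: keep this move
            repertoire.append(move)
        baseline = reading       # thermostat-style: track the new level
    return repertoire


print(run_courtship())
```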

  Like Kari, Levy’s proposed automated lover could even answer the old philosopher’s question of how love demands that infinite desire be directed toward a finite object. Fancy a gawky, retiring lover one night and a “woman looking like Marilyn Monroe . . . with the brainpower and knowledge of a university professor and the conversational style of a party-loving teenager” the next? No problem. Rather than having to willfully idealize the human lover to make them unique, the ideal partner could be created, then constantly modified in the manner of a Facebook profile. Even the suggestion that we might be drawn to individuals with flaws doesn’t appear to faze Levy. On the contrary, if a “perfect” relationship “requires some imperfections of each partner to create occasional surprises . . . it might . . . prove necessary to program robots with a varying level of imperfection in order to maximize their owner’s relationship satisfaction,” he writes. Later on, he gives an example of how this might practically function, as he imagines a lover’s tiff in which the human party in the couple finally loses patience with their algorithm’s operating efficiency and yells, “I wish you weren’t always so goddamn calm.” To resolve this issue and restore stability to its optimal level, Levy suggests that here it might be necessary for the algorithmic partner to simply “reprogram itself to be slightly less emotionally stable.”

  It’s easy when you know how. Although one can’t help but think you risk losing a certain spark in the process.

  CHAPTER 3

  Do Algorithms Dream of Electric Laws?

  A decade ago, Walmart stumbled upon an oddball piece of information while using its data-mining algorithms to comb through the mountains of information generated by its 245 million weekly customers. What it discovered was that, alongside the expected emergency supplies of duct tape, beer and bottled water, no product saw more of an increase in demand during severe weather warnings than strawberry Pop-Tarts. To test this insight, when news broke about the impending Hurricane Frances in 2004, Walmart bosses ordered trucks stocked with the Kellogg’s snack to be delivered to all its stores in the hurricane’s path. When these promptly sold out, Walmart bosses knew that they had gained a valuable glimpse into both consumer habits and the power of The Formula.1
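  The Pop-Tarts discovery boils down to a comparison: for each product, how did sales during severe-weather warnings compare with sales on ordinary days? The Python sketch below shows that calculation with invented figures; it illustrates the general technique rather than reconstructing Walmart’s actual analysis.

```python
# Toy version of the analysis described above: compare each product's sales
# during severe-weather warnings with its sales on ordinary days and rank by
# the resulting lift. All figures are invented placeholders, not Walmart data.

normal_day_sales = {
    "duct tape": 120,
    "beer": 900,
    "bottled water": 400,
    "strawberry pop-tarts": 150,
}
warning_day_sales = {
    "duct tape": 300,
    "beer": 1600,
    "bottled water": 1400,
    "strawberry pop-tarts": 1100,
}

lift = {item: warning_day_sales[item] / normal_day_sales[item]
        for item in normal_day_sales}

for item, ratio in sorted(lift.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item}: {ratio:.1f}x normal demand")
# With these made-up numbers, strawberry pop-tarts top the list at ~7.3x.
```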

  Walmart executives weren’t alone in seeing the value of this discovery. At the time, psychologist Colleen McCue and Los Angeles police chief Charlie Beck were collaborating on a paper for the law-enforcement magazine The Police Chief. They too seized upon Walmart’s revelation as a way of reimagining police work in a form that would be more predictive and less reactive. Entitled “Predictive Policing: What Can We Learn from Walmart and Amazon about Fighting Crime in a Recession?,” their 2009 paper immediately captured the imagination of law-enforcement professionals around the country when it was published.2 What McCue and Beck meant by “predictive policing” was that, thanks to advances in computing, crime data could now be gathered and analyzed in near-real time—and subsequently used to anticipate, prevent and respond more effectively to those crimes that would take place in the future. As per the slogan of Quantcast—the web-analytics company I described back in Chapter 1—it means that police could “Know Ahead. Act Before.”™

  Today, there is no man more associated with the field of predictive policing than Los Angeles Police Department’s Captain Sean Malinowski. Despite his reputation within the force as a “computer guy,” Malinowski’s background is actually in marketing. Prior to his joining the LAPD in 1994, he worked as a marketing account executive, helping chewing gum and margarine companies roll out new products by figuring out how best to hit customers with targeted coupons and special offers. Through a stroke of good fortune Malinowski ended up working on a drunk-driving campaign with several officers from the City of Detroit Police Department. He found working with police to be a revelation. “Up until that point I was all gung-ho on the marketing thing,” he says. “I hadn’t thought about the police force before that. Part of it was that I had reached a time in my marketing career where I was thinking, ‘Christ, is this it: my whole life’s mission is going to be to sell edible fats and oils?’ The cops I was working with had a real mission; they were trying to do something bigger.”

  Through some mutual friends, Malinowski was introduced to a former New York cop, who had recently moved to Chicago to work as an academic. After speaking with him, Malinowski quit the marketing business and went back to school. Several years later he graduated from the University of Illinois with a master of arts in criminal justice. Then it was on to the LAPD Training Division, where Malinowski wound up being elected class president. His first job with the police department proper was on Pacific Beach Detail, cycling endlessly up and down the Venice beachfront to interview street vendors and ensure public safety.

  Malinowski’s big break came when he was assigned to work for Chief William Bratton: first as his aide and later as his executive officer. Bratton had just moved to Los Angeles from New York City, where he’d established a formidable reputation as the cop who had halved the city’s murder rate in the span of a few years. Bratton’s methods, though effective, were undeniably offbeat. Prior to joining the New York Police Department, he had headed up the New York transit police, where he had transformed the New York subway system from a veritable hellhole into a clean, orderly microcosm of a law-abiding society by . . . cracking down on fare dodging. In other words, at a time when serious crimes on the subway were at an all-time high, Bratton focused his attention on making sure that people paid for their tickets. His reasons, as recalled in his 2009 memoir, were simple: fare evasion was a gateway drug to more serious crime. “Legitimate riders felt that they were entering a place of lawlessness and disorder,” he noted. “They saw people going in for free and began to question the wisdom of abiding by the law . . . The system was veering toward anarchy.”3 Once police began stopping and searching violators for even the most minor of infractions, troublemakers decided that it was easier to pay their fares and leave their weapons (which were often uncovered during searches) at home. Crime fell dramatically.

  Relocated to Los Angeles, Chief Bratton wanted to use some of that preemptive mojo on a grander scale. Working under him for five years, Malinowski witnessed firsthand how Bratton was able to take a department stymied by inertia and push through changes by sheer force of will. “When you’re in a bureaucratic organization, you get so used to the barriers coming up that it can limit people’s creativity,” Malinowski says. “What Bratton instilled in me was not to be affected by all that. He taught me to think big and make things happen.”

  More than anything, Bratton was always on the lookout for the next “big idea” to revolutionize his work. In predictive analytics, he felt that he had found it. What Bratton had noticed was a correlation between the crime rate and the speed at which crime data could be analyzed. In 1990, crime data was collected and reviewed only on an annual basis. At the time, crime was on a steep rise in the majority of American cities. When 1995 rolled around, crime data could be looked at on a month-by-month basis. Crime rates during that same period slowed. With crime data now viewable on a moment-to-moment basis, Bratton posited that it could be used to predict where crimes would next take place, leading to an actual drop in crime rates. In the same way that companies like Quantcast and Google mine user data for insights, the idea behind predictive policing was that, rather than simply identifying past crime patterns, analysts could focus on finding the next crime location within an existing pattern. To put it in Amazon terms: You stole a handbag; how about robbing a liquor store?

  Why Is Crime Like an Earthquake?

  It is widely accepted that crimes are not dispersed randomly across a particular area, but rather cluster in small geographical pockets known as “hotspots.” In Seattle, for example, crime data gathered over a 14-year period shows that half of all crime can be isolated to just 4.5 percent of the city’s streets. A similar observation holds true for Minneapolis, where 3.3 percent of all street addresses generate more than 50 percent of calls to police. Over a 28-year stretch, a marginal 8 percent of streets in Boston accounted for a whopping 66 percent of street robberies. Knowing about these hotspots, and the type of crime that is likely to take place at them, can be vital in helping direct police to specific parts of a city.4
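  Each of those statistics takes the same form: sort streets by how much crime they generate, then count how few of them are needed to cover a given share of incidents. A minimal Python sketch follows, with invented sample data rather than the Seattle, Minneapolis or Boston figures.

```python
# Sketch of the concentration statistic behind the hotspot figures above:
# sort streets by incident count and see how small a fraction of them
# accounts for half of all crime. The counts here are invented sample data.

incidents_per_street = {
    "Main St": 120, "2nd Ave": 95, "Harbor Rd": 60, "Oak St": 8,
    "Elm St": 6, "Pine St": 5, "Maple Ave": 3, "Cedar Ln": 2,
    "Birch Way": 1, "Willow Ct": 0,
}


def share_of_streets_for(crime_share: float, counts: dict) -> float:
    """Fraction of streets needed to cover `crime_share` of all incidents."""
    total = sum(counts.values())
    covered, streets_used = 0, 0
    for _, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        covered += n
        streets_used += 1
        if covered >= crime_share * total:
            break
    return streets_used / len(counts)


print(f"{share_of_streets_for(0.5, incidents_per_street):.0%} of streets "
      "account for half of all incidents")   # 20% with this sample data
```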

 
