
The Formula: How Algorithms Solve All Our Problems... and Create More


by Luke Dormehl


  Here it is worth turning once more to the work of Alvin Toffler, whose concept of “demassification” laid out many of the principles described in this chapter. In The Third Wave, Toffler questions why it is that everyone should be asked to start work at 9 A.M. and finish at 5 P.M. each day. By changing this massified approach to a more individual one, centered on the self, Toffler argues that both employers and employees would benefit. The former could use insights about the times their employees are at their most productive to maximize efficiency. Employees, meanwhile, could arrange their working hours around their nonwork duties, or simply their natural biological rhythms (which we now know can be ascertained through wearable sensors). Of course, what sounded like a utopian formula to Toffler now exists as the commonplace “flexitime” approach to employment, in which many companies have laid off their permanent workforce in favor of a free-floating pool of part-time, fixed-term and freelance workers lacking in benefits and job security. In Becky Hogge’s Barefoot into Cyberspace, the author relates this directly to the dream of the techno-solutionists, noting how “the eighties personal computer gurus are . . . the same folk who went around major corporations advising them on ways to decouple their fortunes from those of their employees, ushering in the era of precarious employment that is my generation’s norm.”29 In the gamified collapse of work into play and play into work, concepts like performance-based pay (presented as another level of personalization) mean that even those jobs that do not immediately lend themselves to increased speed and efficiency can be subjected to The Formula.

  This neo-Taylorist dynamic becomes more apparent the further you look down the high-tech food chain. In Amazon’s warehouses, for example, product pickers (known as “fulfillment associates”) are issued handheld computers that transmit instructions revealing where individual products are to be picked up or dropped off. Because of the size of Amazon’s warehouses, a routing algorithm is used to work out the shortest possible journey from point to point. That is not all the handheld computers do, however. They also collect a constant, real-time stream of data that monitors how fast employees walk and complete individual orders, thus quantifying their productivity.30 Like the bomb-loaded bus in the movie Speed, workers must maintain a certain minimum speed, or else see their jobs go up in smoke. As with Henry Ford’s assembly lines, so too here does machinery determine the pace of work. A warehouse manager at Amazon has been quoted as describing workers as “sort of like a robot, but in human form.”31 An article for Fast Company paints a depressingly Orwellian picture:

  An Amazon fulfillment associate might have to walk as far as 15 miles in a single shift, endlessly looping back and forth between shelves in a warehouse the size of nine soccer fields. They do this in complete silence, except for the sound of their feet. The atmosphere is so quiet that workers can be fired for even talking to one another. And all the while, cardboard cutouts of happy Amazon workers look on, cartoon speech bubbles frozen above their heads: “This is the best job I ever had!”32
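
  Though Amazon’s actual system is not public, the routing problem it solves is easy to sketch. The toy Python example below shows a greedy nearest-neighbor heuristic of the general kind such a handheld could use to sequence picks; the grid coordinates, Manhattan-distance metric and function names are all illustrative assumptions, not Amazon’s algorithm.

```python
# A toy pick-routing sketch: visit the closest remaining shelf each time.
# All coordinates and names are illustrative assumptions, not Amazon's system.

def manhattan(a, b):
    """Walking distance on a warehouse grid, where aisles force right angles."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def plan_route(start, picks):
    """Order the pick list greedily: always walk to the nearest remaining item."""
    route, position, remaining = [], start, list(picks)
    while remaining:
        nearest = min(remaining, key=lambda shelf: manhattan(position, shelf))
        route.append(nearest)
        remaining.remove(nearest)
        position = nearest
    return route

# Four shelf locations expressed as (aisle, bay) coordinates.
print(plan_route((0, 0), [(5, 2), (1, 8), (4, 4), (2, 1)]))
# -> [(2, 1), (5, 2), (4, 4), (1, 8)]
```

  A greedy tour like this is rarely optimal, and real fulfillment software presumably layers smarter optimization on top, but it captures the essential logic: a machine dictating each successive step of a worker’s shift.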

  Similar reports can be seen elsewhere. In Tesco warehouses in the UK, workers are made to wear arm-mounted electronic terminals so that managers can grade how hard they are working. Employees are allocated a set amount of time to collect and complete each order from the warehouse. If they meet the target, they are awarded a 100 percent score, rising to 200 percent if the task is completed in half the allotted time. Conversely, scores fall dramatically when a task takes longer than expected.33
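
  The scoring scheme reported at Tesco reduces to a single ratio, which a short sketch makes plain. The function below is a minimal reconstruction based only on the description above; the real system’s inputs and weighting are not public.

```python
# A minimal reconstruction of the reported pick-rate score: the ratio of
# allocated time to actual time, expressed as a percentage. Illustrative only.

def productivity_score(allocated_minutes, actual_minutes):
    """100 means exactly on target; 200 means twice as fast; 50, twice as slow."""
    return 100 * allocated_minutes / actual_minutes

print(productivity_score(10, 10))  # on target     -> 100.0
print(productivity_score(10, 5))   # twice as fast -> 200.0
print(productivity_score(10, 20))  # twice as slow -> 50.0
```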

  Decimated-Reality Aggregators

  Speaking in October 1943, during debates over the rebuilding of the House of Commons, whose chamber had been destroyed by bombing during the Blitz, British prime minister Winston Churchill observed, “We shape our buildings; thereafter they shape us.”34 Something similar might be said in the age of The Formula, in which users shape their online profiles, and from that point forward their online profiles begin to shape them—both in terms of what they see and, perhaps more crucially, what they don’t.

  Writing about a start-up called Nara in mid-2013, I coined the phrase “decimated reality aggregators” to describe what the company was trying to do.35 Nara started out as a restaurant recommender system, connecting thousands of restaurants around the world, but its ultimate goal was to become the recommender system for your life: drawing on what it knew about you from the restaurants you ate in to suggest everything from hotels to clothes. Nara even incorporated the idea of upward mobility into its algorithm. Say, for example, you wanted to be a wine connoisseur two years down the line, but currently had no idea how to tell your Chardonnay from your Chianti. Plotting a path through a mass of aggregated user data, Nara could subtly poke and prod you to make sure that you arrived at that end point after a certain amount of time. If you trained Nara’s algorithms to recognize what you wanted, Nara’s algorithms could then train you to fit a certain desired mold. “Decimated reality,” in this sense, was a way of getting away from the informational overload of the Internet. Users wouldn’t see more options than they could handle—they would see only what was deemed relevant.

  “The Internet has evolved into a transactional machine where we give our eyeballs and clicks, and the machine gives us back advertising and clutter,” says Nathan Wilson, chief technology officer at Nara Logics Inc. “I’m interested in trying to subvert all of that; removing the clutter and noise to create a more efficient way to help users gain access to things.” The problem, of course, is that in order to save users time by removing the “clutter” of the online world, Nara’s algorithms must make constant decisions on their behalf about what they should and should not see.
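
  What a “decimated reality” recommender does can be captured in a few lines. The sketch below scores each option against a user’s taste profile and shows only the best few, silently withholding everything below the cut. The profile weights, tags and scoring rule are illustrative assumptions, not Nara’s actual (and proprietary) algorithm.

```python
# A toy "decimated reality" filter: rank options against a taste profile and
# discard all but the top few. Weights and tags are illustrative assumptions.

def decimate(options, profile, keep=2):
    """Return only the most 'relevant' options; the rest simply never appear."""
    def relevance(option):
        return sum(profile.get(tag, 0.0) for tag in option["tags"])
    return sorted(options, key=relevance, reverse=True)[:keep]

profile = {"wine": 0.9, "italian": 0.7, "fast-food": 0.1}
options = [
    {"name": "Enoteca",     "tags": ["wine", "italian"]},
    {"name": "Burger Stop", "tags": ["fast-food"]},
    {"name": "Trattoria",   "tags": ["italian"]},
    {"name": "Noodle Bar",  "tags": ["asian"]},
]
print([o["name"] for o in decimate(options, profile)])  # -> ['Enoteca', 'Trattoria']
```

  Note that the user never sees the noodle bar, nor learns that it was excluded. That invisibility is the decimation.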

  This effect is often called the “filter bubble.” In his book of the same name, Eli Pariser notes how two different users searching for the same thing on Google will receive very different sets of results.36 A liberal who types “BP” into his browser might get information about the April 2010 oil spill in the Gulf of Mexico, while a conservative typing the same two letters is more likely to receive investment information about the oil company. Similarly, algorithms are more likely to respond to a woman’s search for “wagner” by directing her toward sites about the composer Richard Wagner, while the same search by a man is taken to mean Wagner USA paint supplies. As such, what search algorithms present is not a formula designed to give ideologically untampered answers but precisely the opposite: search results that flatter our personal mythologies by reinforcing what we already “know” about particular issues, while downgrading in importance the concerns that do not marry up with our existing worldview.

  Despite the apparent freedom of such an innovation (surely personalization equals good?), it is not difficult to see the downside. Unlike the libertarian technologist’s pipe dream of a world that is free, flat and open to all voices, a key component of code and algorithmic culture is software’s task of sorting, classifying and creating hierarchies. Since so much of the revenue of companies like Google depends on the cognitive capital generated by users, this “software sorting” immediately gives the lie to the idea that there is no such thing as a digital caste system. As with the “filter bubble,” it can be difficult to tell whether the endless distinctions made regarding geo-demographic profiles are helpful examples of mass customization or exclusionary examples of coded discrimination. The philosopher Félix Guattari imagined a city in which a person was free to leave their apartment, their street or their neighborhood thanks to an electronic security card that raised barriers at each intersection. While this card represents freedom, however, it also represents repression, since the technology that opens doors for us might just as easily keep them shut. Much the same might be said of the algorithm, which directly and automatically grants or withholds individuals’ social and geographical access to any number of goods, services and opportunities.

  The idea that a certain class of user can be willfully inconvenienced in favor of another, more desirable class is a key part of market segmentation. At various times, UK supermarkets have investigated the possibility of charging a premium for food shopping at peak hours, in an effort to deter “cash-poor but time-rich” customers from negatively impacting the shopping experience of the more desirable “cash-rich but time-poor” professionals.37 Many airlines offer premium schemes that allow valuable business travelers to pay a surcharge to bypass certain border controls. Sorting processes divide passengers into those enrolled in the scheme and those who are not. The former, premium group is offered parking spaces close to the airport terminal and then allowed to pass through to dedicated members’ lounges with speed. The latter, “cash-poor but time-rich” group is typically assigned parking spaces a long distance from the terminal, excluded from VIP lounges, and forced to endure lengthy check-in times and security queues. A similar brand of thinking is behind schemes set up in cities such as Los Angeles, San Diego, Toronto, Melbourne and Tokyo, in which privately funded highways are built for use by affluent motorists, who access them by paying a premium fee. To keep these highways exclusive, algorithms estimate the exact price per journey likely to deter enough drivers to guarantee free-flowing traffic, regardless of how bad congestion might be on the surrounding public highway system.38
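
  The congestion-deterrent pricing just described can likewise be sketched in a few lines: raise the toll until predicted demand falls within what the road can carry at free flow. The linear demand model and every constant below are illustrative assumptions, not any operator’s actual formula.

```python
# A toy congestion-pricing loop: nudge the toll upward until enough drivers
# are deterred to keep traffic free-flowing. All constants are assumptions.

def set_toll(expected_drivers, free_flow_capacity,
             base_toll=1.0, step=0.25, deterrence=0.05):
    """Return the lowest toll (in dollars) that keeps demand within capacity."""
    def predicted_demand(price):
        # Assume each extra dollar of toll deters a fixed share of drivers.
        return expected_drivers * max(0.0, 1.0 - deterrence * price)
    toll = base_toll
    while predicted_demand(toll) > free_flow_capacity:
        toll += step
    return toll

# 10,000 drivers want the road, but free flow tolerates only 6,000 of them.
print(set_toll(expected_drivers=10_000, free_flow_capacity=6_000))  # -> 8.0
```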

  Squelching the Scavengers

  The digital world is not immune to these practices. In the past, Internet networking provider Cisco has referred to its less-than-premium customers as a “scavenger class” for whom the company provides “less-than-best-effort services” to certain applications. At times of Internet congestion and slow download times, traffic can be (in Cisco’s words) “squelched to virtually nothing” for scavenger-class users, while more valued business users are given access to Cisco’s answer to those free-moving, privately funded highways. Similar distinctions are made elsewhere. A marketing brochure from communications company Avaya similarly promises that its bespoke systems allow algorithms to compare the phone numbers of individual call-center callers against a database, and then route a call through to an agent as high priority if the caller is among the top 5 percent of customers. In this scenario, when the agent picks up the call, they hear a whispered announcement that the caller is a “Top 5.” Much the same technology could be used to filter the personalities of individual callers. Callers might then receive service according to personality type, so that lucrative customers likely to lose patience with a service quickly could be ranked above personality types prone to procrastinating and asking lots of questions while being unlikely to spend large amounts of money with the company in the future.
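
  The call-routing logic the Avaya brochure gestures at amounts to a priority queue keyed on customer value. The sketch below is a reconstruction from the description above; the percentile threshold, field names and whispered label are illustrative assumptions, not Avaya’s implementation.

```python
# A reconstruction of value-based call routing as described above: look the
# caller up and queue the top 5 percent of customers ahead of everyone else.
# The threshold, field names and "Top 5" whisper are illustrative assumptions.
import heapq

HIGH, NORMAL = 0, 1  # lower number = answered first

def route_call(queue, caller_id, spend_percentile):
    """Push a caller onto the hold queue; 'Top 5' customers jump ahead."""
    priority = HIGH if spend_percentile >= 95 else NORMAL
    whisper = "Top 5" if priority == HIGH else ""
    heapq.heappush(queue, (priority, caller_id, whisper))

hold_queue = []
route_call(hold_queue, "+1-555-0100", spend_percentile=50)
route_call(hold_queue, "+1-555-0199", spend_percentile=99)
print(heapq.heappop(hold_queue))  # -> (0, '+1-555-0199', 'Top 5')
```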

  Perhaps the most granular example of this “human software sorting” can be seen with the algorithm-driven idea of “differential pricing.” This is something a number of online stores, including Amazon, have already experimented with. Without the added gloss of marketing speak, differential pricing means that certain users can be charged more than others for the same product. In the abstract, the algorithms used for this are no different from those that predict that, since a person bought the Harry Potter and Twilight books, they might also enjoy the Hunger Games trilogy. It is presented as another level of personalization, in line with the website that remembers your name (and, more importantly, your credit-card details). In September 2012, Google was granted a patent for “Dynamic Pricing of Electronic Content,” which allows it to change the listed price of online materials, such as video and audio recordings, e-books and computer games, based upon whether its algorithms determine a user is more or less likely to purchase a particular item. The patent filing was accompanied by illustrations gleefully suggesting that certain users could be convinced to pay up to four times what others are charged for the exact same digital file.39 In other words, if Google’s algorithms “know” that you are susceptible to tween-fodder like Harry Potter and Twilight, based on what you have searched for in the past, they can ensure that you pay through the nose for The Hunger Games—while enticing the person who has only ever demonstrated the slightest interest in teenage wizards and sparkly vampires by lowering the price to tempt them into a purchase.
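
  Stripped of its patent language, dynamic pricing of this kind can be expressed as a one-line scaling rule: the likelier the algorithm thinks you are to buy, the higher your price. The multiplier range below is an illustrative assumption, loosely echoing the fourfold spread in the patent’s illustrations rather than reproducing Google’s actual method.

```python
# A toy differential-pricing rule: scale the listed price by the predicted
# probability of purchase. The multiplier range is an illustrative assumption.

def personal_price(base_price, purchase_probability,
                   min_multiplier=0.5, max_multiplier=4.0):
    """Keen buyers see up to 4x the base price; reluctant ones see a discount."""
    multiplier = min_multiplier + (max_multiplier - min_multiplier) * purchase_probability
    return round(base_price * multiplier, 2)

print(personal_price(9.99, 0.95))  # the devoted fan pays      -> 38.21
print(personal_price(9.99, 0.05))  # the hesitant browser pays ->  6.74
```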

  Recent years have also seen a rise in so-called “emotion-sniffing” algorithms, designed to predict a user’s emotional state based on their tone of voice, facial expression—or even browsing history. A study carried out by Microsoft Research analyzed people’s phone records, app usage and current location, and then used these metrics to predict their mood. According to Microsoft, the algorithm’s daily mood estimates start at 66 percent accuracy and gradually increase to 93 percent after a two-month training period.40 Since mood significantly influences consumer preferences, information like this could prove invaluable to marketers.41 Let’s say, for example, that your computer or smartphone determines that you’re likely to be feeling particularly vulnerable at a given moment. In such a scenario it might be possible to surreptitiously raise the price of products you are likely to be interested in, since an algorithm designed to “sniff” your mood has determined that you’re statistically more susceptible to making a purchase in this state.
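
  Combine a mood estimate with a pricing rule like the one above and the scenario writes itself. The sketch below is pure conjecture for illustration: the behavioral signals, weights and markup ceiling are invented, and it bears no relation to Microsoft’s actual model.

```python
# A conjectural mood-sensitive markup: estimate a crude "vulnerability" score
# from behavioral signals, then raise the price when susceptibility peaks.
# Every signal, weight and threshold here is an invented assumption.

def vulnerability(late_night_hours, comfort_searches, social_calls):
    """Toy score in [0, 1]: more isolation-like signals push it higher."""
    raw = 0.1 * late_night_hours + 0.2 * comfort_searches - 0.05 * social_calls
    return max(0.0, min(1.0, raw))

def mood_price(base_price, mood_score, max_markup=0.25):
    """Mark the price up by as much as 25 percent at peak susceptibility."""
    return round(base_price * (1 + max_markup * mood_score), 2)

score = vulnerability(late_night_hours=3, comfort_searches=4, social_calls=2)
print(score, mood_price(49.99, score))  # -> 1.0 62.49
```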

  Free-market idealists might argue in favor of such an approach. In the same way that an upmarket restaurant charges more for the exact same bottle of beer that a person could buy for half the price somewhere else, so concepts like differential pricing could be used to ensure that a person pays exactly the price he or she is willing to spend on a product—albeit with more scientific precision. However, while this is undoubtedly true, the more accurate analogy may be the restaurant whose waiters rifle through your belongings for previous receipts before deciding what they think you ought to be charged for your food and drink. Differential pricing could be used to level the playing field (everyone pays 1/250 of their weekly salary for a beer, for example). More likely, it could be used to do just the opposite: to raise the price of particular goods with the aim of marginalizing or driving away less lucrative—and therefore less desirable—users. A high-fashion brand might, for instance, want to avoid becoming associated with a supposedly “undesirable” consumer base, as happened to Burberry in the early 2000s. Since it cannot outright bar particular groups from purchasing its products, the company could pay to have its products disappear from the algorithmic recommendations given to those earning under $50,000 per year or—in the event that its products were specifically searched for—to have them priced out of such customers’ range.

  The real problem with this is the invisibility of the process. In the same way that we see only the end results when algorithms select the personalized banners that appear on the websites we browse, or determine the films recommended to us on Netflix, so with differential pricing customers are never told that they are being asked to pay more than their next-door neighbor. After all, who would continue shopping if they were? As Joseph Turow, a professor at the University of Pennsylvania’s Annenberg School for Communication and a frequent writer on all things marketing, has pointed out in the New York Times: “The flow of data about us is so surreptitious and so complex that we won’t even know when price discrimination starts. We’ll just get different prices, different news, different entertainment.”42

  The Discrimination Formula?

  A number of Internet theorists have argued that in the digital world, previous classifications used for discrimination (including race, gender or sexuality) will fall away—if they haven’t already. Alvin Toffler’s The Third Wave identifies a number of individuals and groups subtly or openly discriminated against over the last centuries and argues that this marginalization is the product of Second Wave societies. According to Toffler, in the unbundled, personality-driven Third Wave society such discriminatory practices will slink off into the digital ether. This is a popular utopian view. In Mark Hansen’s essay “Digitizing the Racialized Body, or The Politics of Common Impropriety,” the author builds on this point by suggesting that the web makes possible an unprecedented number of opportunities for ethical encounters between people of different races, since race as a visual signifier is rendered invisible.43 The mistake made by both Toffler and Hansen is to assume that discrimination is always collectivist in nature. The Formula suggests that this is far from the case. In an age in which power no longer has to be embodied within a set structure and can be both codified and free-floating, discrimination is able to become even more granular, with categories such as race and gender made increasingly nonstatic and fluid. As can be seen with Internet-dating profiles, which I describe in more detail in Chapter 2, in order for algorithmic sorting to take place, individuals must first be subjected to the process of segmentation, whereby they are divided up into their individual components for the purposes of analytics. Instead of being “individuals,” they are turned into “dividuals.”

  The concept of “dividuals” is not mine. The French philosopher Gilles Deleuze coined this phrase to describe physically embodied human beings who are nonetheless endlessly divided and reduced to data representations using tools such as algorithms. The Formula, Deleuze and his coauthor Félix Guattari argue in A Thousand Plateaus, has turned man into a “segmentary animal.”

  We are segmented in a binary fashion, following the great major dualist oppositions: social classes, but also men-women, adults-children, and so on. We are segmented in a circular fashion, in ever larger circles, ever wider disks or coronas, like Joyce’s “letter”: my affairs, my neighborhood’s affairs, my city’s, my country’s, the world’s . . . We are segmented in a linear fashion, along a straight line or a number of straight lines, of which each segment represents an episode or “proceeding”: as soon as we finish one proceeding we begin another, forever proceduring or procedured, in the family, in the school, in the army, on the job.44

 
