Algorithms of Oppression


by Safiya Umoja Noble


  Chapter 4 is dedicated to critiquing the field of information studies and foregrounds how these issues of organizing public information through classification projects on the web, such as commercial search, are old problems that we must solve as a scholarly field of researchers and practitioners. I offer a brief survey of how library classification projects undergird the invention of search engines such as Google and how our field is implicated in the algorithmic process of sorting and classifying information and records. In chapter 5, I discuss the future of knowledge in the public and reference the work of library and information professionals, in particular, as important to the development and cultivation of equitable classification systems, since these are the precursors to commercial search engines. This chapter is essential history for library and information professionals, who are less likely to encounter the politics of cataloguing and classification bias in their professional training. Chapter 6 explores public policy and why we need regulation in our information environments, particularly as they are increasingly controlled by corporations.

  To conclude, I move the discussion beyond Google, to help readers think about the impact of algorithms on how people are represented in other seemingly benign business transactions. I look at the “colorblind” organizing logic of Yelp and how business owners are revolting due to loss of control over how they are represented and the impact of how the public finds them. Here, I share an interview with Kandis from New York,10 whose livelihood has been dramatically affected by public-policy changes such as the dismantling of affirmative action on college campuses, which have hurt her local Black-hair-care business in a prestigious college town. Her story brings to light the power that algorithms have on her everyday life and leaves us with more to think about in the ecosystem of algorithmic power. The book closes with a call to recognize the importance of how algorithms are shifting social relations in many ways—more ways than this book can cover—and should be regulated with more impactful public policy in the United States than we currently have. My hope is that this book will directly impact the many kinds of algorithmic decisions that can have devastating consequences for people who are already marginalized by institutional racism and sexism, including the 99% who own so little wealth in the United States that the alarming trend of social inequality is not likely to reverse without our active resistance and intervention. Electoral politics and financial markets are just two of many of these institutional wealth-consolidation projects that are heavily influenced by algorithms and artificial intelligence. We need to cause a shift in what we take for granted in our everyday use of digital media platforms.

  I consider my work a practical project, the goal of which is to eliminate social injustice and change the ways in which people are oppressed with the aid of allegedly neutral technologies. My intention in looking at these cases serves two purposes. First, we need interdisciplinary research and scholarship in information studies and library and information science that intersects with gender and women’s studies, Black/African American studies, media studies, and communications to better describe and understand how algorithmically driven platforms are situated in intersectional sociohistorical contexts and embedded within social relations. My hope is that this work will add to the voices of my many colleagues across several fields who are raising questions about the legitimacy and social consequences of algorithms and artificial intelligence. Second, now, more than ever, we need experts in the social sciences and digital humanities to engage in dialogue with activists and organizers, engineers, designers, information technologists, and public-policy makers before blunt artificial-intelligence decision making trumps nuanced human decision making. This means that we must look at how the outsourcing of information practices from the public sector facilitates privatization of what we previously thought of as the public domain11 and how corporate-controlled governments and companies subvert our ability to intervene in these practices.

  We have to ask what is lost, who is harmed, and what should be forgotten with the embrace of artificial intelligence in decision making. It is of no collective social benefit to organize information resources on the web through processes that solidify inequality and marginalization—on that point I am hopeful many people will agree.

  1

  A Society, Searching

  On October 21, 2013, the United Nations launched a campaign directed by the advertising agency Memac Ogilvy & Mather Dubai using “genuine Google searches” to bring attention to the sexist and discriminatory ways in which women are regarded and denied human rights. Christopher Hunt, art director of the campaign, said, “When we came across these searches, we were shocked by how negative they were and decided we had to do something with them.” Kareem Shuhaibar, a copywriter for the campaign, described on the United Nations website what the campaign was determined to show: “The ads are shocking because they show just how far we still have to go to achieve gender equality. They are a wake up call, and we hope that the message will travel far.”1 Over the mouths of various women of color were the autosuggestions that reflected the most popular searches that take place on Google Search. The Google Search autosuggestions featured a range of sexist ideas such as the following:

  • Women cannot: drive, be bishops, be trusted, speak in church

  • Women should not: have rights, vote, work, box

  • Women should: stay at home, be slaves, be in the kitchen, not speak in church

  • Women need to: be put in their places, know their place, be controlled, be disciplined

  While the campaign employed Google Search results to make a larger point about the status of public opinion toward women, it also served, perhaps unwittingly, to underscore the incredibly powerful nature of search engine results. The campaign suggests that search is a mirror of users’ beliefs and that society still holds a variety of sexist ideas about women. What I find troubling is that the campaign also reinforces the idea that it is not the search engine that is the problem but, rather, the users of search engines who are. It suggests that what is most popular is simply what rises to the top of the search pile. While serving as an important and disturbing critique of sexist attitudes, the campaign fails to implicate the algorithms or search engines that drive certain results to the top. This chapter moves the lens onto the search architecture itself in order to shed light on the many factors that keep sexist and racist ideas on the first page.

  Figure 1.1. Memac Ogilvy & Mather Dubai advertising campaign for the United Nations.

  One limitation of looking at the implications of search is that it is constantly evolving and shifting over time. This chapter captures aspects of commercial search at a particular moment—from 2009 to 2015—but surely by the time readers engage with it, it will be a historical rather than contemporary study. Nevertheless, the goal of such an exploration of why we get troublesome search results is to help us think about whether it truly makes sense to outsource all of our knowledge needs to commercial search engines, particularly at a time when the public is increasingly reliant on search engines in lieu of libraries, librarians, teachers, researchers, and other knowledge keepers and resources.

  What is even more crucial is an exploration of how people living as minority groups under the influence of a majority culture, such as people of color and sexual minorities in the United States, are often subject to the whims of the majority and other commercial influences such as advertising when trying to affect the kinds of results that search engines offer about them and their identities. If the majority rules in search engine results, then how might those who are in the minority ever be able to influence or control the way they are represented in a search engine? The same might be true of how men’s desires and usage of search is able to influence the values that surround women’s identities in search engines, as the Ogilvy campaign might suggest. For these reasons, a deeper exploration into the historical and social conditions that give rise to problematic search results is in order, since rarely are they questioned and most Internet users have no idea how these ideas come to dominate search results on the first page of results in the first place.

  Google Search: Racism and Sexism at the Forefront

  My first encounter with racism in search came to me through an experience that pushed me, as a researcher, to explore the mechanisms—both technological and social—that could render the pornification of Black women a top search result, naturalizing Black women as sexual objects so effortlessly. This encounter was in 2009 when I was talking to a friend, André Brock at the University of Michigan, who casually mentioned one day, “You should see what happens when you Google ‘black girls.’” I did and was stunned. I assumed it to be an aberration that could potentially shift over time. I kept thinking about it. The second time came one spring morning in 2011, when I searched for activities to entertain my preteen stepdaughter and her cousins of similar age, all of whom had made a weekend visit to my home, ready for a day of hanging out that would inevitably include time on our laptops. In order to break them away from mindless TV watching and cellphone gazing, I wanted to engage them in conversations about what was important to them and on their mind, from their perspective as young women growing up in downstate Illinois, a predominantly conservative part of Middle America. I felt that there had to be some great resources for young people of color their age, if only I could locate them. I quickly turned to the computer I used for my research (I was pursuing doctoral studies at the time), but I did not let the group of girls gather around me just yet. I opened up Google to enter in search terms that would reflect their interests, demographics, and information needs, but I liked to prescreen and anticipate what could be found on the web, in order to prepare for what might be in store. What came back from that simple, seemingly innocuous search was again nothing short of shocking: with the girls just a few feet away giggling and snorting at their own jokes, I again retrieved a Google Search results page filled with porn when I looked for “black girls.” By then, I thought that my own search history and engagement with a lot of Black feminist texts, videos, and books on my laptop would have shifted the kinds of results I would get. It had not. In intending to help the girls search for information about themselves, I had almost inadvertently exposed them to one of the most graphic and overt illustrations of what the advertisers already thought about them: Black girls were still the fodder of porn sites, dehumanizing them as commodities, as products and as objects of sexual gratification. I closed the laptop and redirected our attention to fun things we might do, such as see a movie down the street. This best information, as listed by rank in the search results, was certainly not the best information for me or for the children I love. For whom, then, was this the best information, and who decides? What were the profit and other motives driving this information to the top of the results? How had the notion of neutrality in information ranking and retrieval gone so sideways as to be perhaps one of the worst examples of racist and sexist classification of Black women in the digital age yet remain so unexamined and without public critique? From that moment, I began in earnest a series of research inquiries that are central to this book.

  Of course, upon reflection, I realized that I had been using the web and search tools long before the encounters I experienced just out of view of my young family members. It was just as troubling to realize that I had undoubtedly been confronted with the same type of results before but had learned, or been trained, to somehow become inured to it, to take it as a given that any search I might perform using keywords connected to my physical self and identity could return pornographic and otherwise disturbing results. Why was this the bargain into which I had tacitly entered with digital information tools? And who among us did not have to bargain in this way? As a Black woman growing up in the late twentieth century, I also knew that the presentation of Black women and girls that I discovered in my search results was not a new development of the digital age. I could see the connection between search results and tropes of African Americans that are as old and endemic to the United States as the history of the country itself. My background as a student and scholar of Black studies and Black history, combined with my doctoral studies in the political economy of digital information, aligned with my righteous indignation for Black girls everywhere. I searched on.

  Figure 1.2. First page of search results on keywords “black girls,” September 18, 2011.

  Figure 1.3. First page of image search results on keywords “black girls,” April 3, 2014.

  Figure 1.4. Google autosuggest results when searching the phrase “why are black people so,” January 25, 2013.

  Figure 1.5. Google autosuggest results when searching the phrase “why are black women so,” January 25, 2013.

  Figure 1.6. Google autosuggest results when searching the phrase “why are white women so,” January 25, 2013.

  Figure 1.7. Google Images results when searching the concept “beautiful” (did not include the word “women”), December 4, 2014.

  Figure 1.8. Google Images results when searching the concept “ugly” (did not include the word “women”), January 5, 2013.

  Figure 1.9. Google Images results when searching the phrase “professor style” while logged in as myself, September 15, 2015.

  What each of these searches represents are Google’s algorithmic conceptualizations of a variety of people and ideas. Whether looking for autosuggestions or answers to various questions or looking for notions about what is beautiful or what a professor may look like (which does not account for people who look like me who are part of the professoriate—so much for “personalization”), Google’s dominant narratives reflect the kinds of hegemonic frameworks and notions that are often resisted by women and people of color. Interrogating what advertising companies serve up as credible information must happen, rather than have a public instantly gratified with stereotypes in three-hundredths of a second or less.

  In reality, information monopolies such as Google have the ability to prioritize web search results on the basis of a variety of topics, such as promoting their own business interests over those of competitors or smaller companies that are less profitable advertising clients than larger multinational corporations are.2 In this case, the clicks of users, coupled with the commercial processes that allow paid advertising to be prioritized in search results, mean that representations of women are ranked on a search engine page in ways that underscore women’s historical and contemporary lack of status in society—a direct mapping of old media traditions into new media architecture. Problematic representations and biases in classifications are not new. Critical library and information science scholars have well documented the ways in which some groups are more vulnerable than others to misrepresentation and misclassification.3 They have conducted extensive and important critiques of library cataloging systems and information organization patterns that demonstrate how women, Black people, Asian Americans, Jewish people, or the Roma, as “the other,” have all suffered from the insults of misrepresentation and derision in the Library of Congress Subject Headings (LCSH) or through the Dewey Decimal System. At the same time, other scholars underscore the myriad ways that social values around race and gender are directly reflected in technology design.4 Their contributions have made it possible for me to think about the ways that race and gender are embedded in Google’s search engine and to have the courage to raise critiques of one of the most beloved and revered contemporary brands.

  Search happens in a highly commercial environment, and a variety of processes shape what can be found; these results are then normalized as believable and often presented as factual. The associate professor of sociology at Arizona State University and former president of the Association of Internet Researchers Alex Halavais points to the way that heavily used technological artifacts such as the search engine have become such a normative part of our experience with digital technology and computers that they socialize us into believing that these artifacts must therefore also provide access to credible, accurate information that is depoliticized and neutral:

  Those assumptions are dangerously flawed; . . . unpacking the black box of the search engine is something of interest not only to technologists and marketers, but to anyone who wants to understand how we make sense of a newly networked world. Search engines have come to play a central role in corralling and controlling the ever-growing sea of information that is available to us, and yet they are trusted more readily than they ought to be. They freely provide, it seems, a sorting of the wheat from the chaff, and answer our most profound and most trivial questions. They have become an object of faith.5

  Unlike the human-labor curation processes of the early Internet that led to the creation of online directories such as Lycos and Yahoo!, in the current Internet environment, information access has been left to the complex algorithms of machines to make selections and prioritize results for users. I agree with Halavais, and his is an important critique of search engines as a window into our own desires, which can have an impact on the values of society. Search is a symbiotic process that both informs and is informed in part by users. Halavais suggests that every user of a search engine should know how the system works, how information is collected, aggregated, and accessed. To achieve this vision, the public would have to have a high degree of computer programming literacy to engage deeply in the design and output of search.

  Alternatively, I draw an analogy that one need not know the mechanism of radio transmission or television spectrum or how to build a cathode ray tube in order to critique racist or sexist depictions in song lyrics played on the radio or shown in a film or television show. Without a doubt, the public is largely unaware of how these systems work and must have significantly more algorithmic literacy. Yet since all of the platforms I interrogate in this book are proprietary, even if we had algorithmic literacy, we still could not intervene in these private, corporate platforms.

 
