Algorithms of Oppression


by Safiya Umoja Noble


  Where Are Black Girls Now?

  Since I began the pilot study in 2010 and collected data through 2016, some things have changed. In 2012, I wrote an article for Bitch Magazine, which covers popular culture from a feminist perspective, after some convincing from my students that this topic is important to all people—not just Black women and girls. I argued that we all want access to credible information that does not foster racist or sexist views of one another. I cannot say that the article had any influence on Google in any definitive way, but I have continued to search for Black girls on a regular basis, at least once a month. Within about six weeks of the article hitting newsstands, I did another search for “black girls,” and about five months after the article was published, I could see that Google had changed its algorithm to some degree. After years of featuring pornography as the primary representation of Black girls, Google made modifications to its algorithm, and the results as of the conclusion of this research can be seen in figure C.2.

  No doubt, as I speak around the world on this subject, audiences are often furiously doing searches from their smart phones, trying to reconcile these issues with the momentary results. Some days they are horrified, and other times, they are less concerned, because some popular and positive issue or organization has broken through the clutter and moved to a top position on the first page. Indeed, as this book was going into production, news exploded of biased information about the U.S. presidential election flourishing through Google and Facebook, which had significant consequences in the political arena.

  I encourage us all to take notice and to reconsider the affordances and the consequences of our hyperreliance on these technologies as they shift and take on more import over time. What we need now, more than ever, is public policy that advocates protections from the effects of unregulated and unethical artificial intelligence.

  Figure C.2. My last Google search on “black girls,” June 23, 2016.

  Epilogue

  Between the time I wrote this book and the day it went into production, the landscape of U.S. politics was radically altered with the presidential defeat on November 8, 2016, of former secretary of state Hillary Clinton by Donald Trump. Within days, media pundits and pollsters were trying to make sense of the upset, the surprise win by Trump, particularly since Clinton won the popular vote by close to three million votes.

  Immediately, there were claims that “fake news” circulating online was responsible for the outcome. Indeed, as I gave talks about this book in the weeks after the election, I could only note that “as I’ve argued for years about the harm toward women and girls through commercial information bias circulating through platforms like Google, no one has seemed to care until it threw a presidential election.” Notably, one remarkable story about disinformation (patently false information intended to deceive) made headlines about the election results.

  This new political landscape has dramatically altered the way we might think about public institutions being a major force in leveling the playing field of information that is curated in the public interest. And it will likely be the source of a future book that recontextualizes what information means in the new policy regime that ensues under the leadership of avowed White supremacists and disinformation experts who have entered the highest levels of public governance.

  Figure E.1. Google search for “final election results” leads to fake news. Source: Washington Post, November 14, 2016.

  Figure E.2. Google results on final election results incorrectly show Trump as the winner of the popular vote. Source: Washington Post, November 14, 2016.

  Figure E.3. Circulation of false information on Twitter shows Trump as the winner of the popular vote, November 14, 2016.

  Agencies that could have played a meaningful role in supporting research about the role of information and research in society, including the Institute for Museum and Library Services, the National Endowment for the Humanities, and the National Endowment for the Arts, are all under the threat of being permanently defunded and dismantled as of the moment this book goes into production. In fact, public research universities are also facing serious threats of cuts to federal funding because of their lack of compliance with the new administration’s policies. This has so radically shifted the research landscape to the political right that scientists and researchers marched on Washington, D.C., on April 22, 2017, in response to orders that government-funded scientists and researchers stop conducting and disseminating research to the public. The potential for such a precedent may extend to public research universities, or at least many faculty members are working under the premise that this may not be out of the realm of possibility over the next four to eight years.

  In this book, I have argued that the neoliberal political and economic environment has profited tremendously from misinformation and mischaracterization of communities, with a range of consequences for the most disenfranchised and marginalized among us. I have also argued for increased nonprofit and public research funding to explore alternatives to commercial information platforms, which would have included support of noncommercial search engines that could serve the public and pay closer attention to the circulation of patently false or harmful information. In the current environment, I would be remiss if I did not acknowledge, on the eve of the publication of this book, that this may not be viable at all given the current policy environment that is unfolding.

  My hope is that the public will reclaim its institutions and direct our resources in service of a multiracial democracy. Now, more than ever, we need libraries, universities, schools, and information resources that will help bolster and further expand democracy for all, rather than shrink the landscape of participation along racial, religious, and gendered lines. Information circulates in cultural contexts of acceptability. It is not enough to simply want the most accurate and credible information to rise to the top of a search engine, but it is certainly an important step toward impacting the broader culture of information use that helps us make decisions about the distribution of resources among the most powerful and the most disenfranchised members of our society.

  In short, we must fight to suspend the circulation of racist and sexist material that is used to erode our civil and human rights. I hope this book provides some steps toward doing so.

  NOTES

  INTRODUCTION

  1. Matsakis, 2017.

  2. See Peterson, 2014.

  3. This term was coined by Eli Pariser in his book The Filter Bubble (2011).

  4. See Dewey, 2015.

  5. I use phrases such as “the N-word” or “n*gger” rather than explicitly using the spelling of a racial epithet in my scholarship. As a regular practice, I also do not cite or promote non–African American scholars or research that flagrantly uses the racial epithet in lieu of alternative phrasings.

  6. See Sweney, 2009.

  7. See Boyer, 2015; Craven, 2015.

  8. See Noble, 2014.

  9. The term “digital footprint,” often attributed to Nicholas Negroponte, refers to the online identity traces that are used by digital media platforms to understand the profile of a user. The online interactions are often tracked across a variety of hardware (e.g., mobile phones, computers, internet services) and platforms (e.g., Google’s Gmail, Facebook, and various social media) that are on the World Wide Web. Digital traces are often used in the data-mining process to profile users. A digital footprint can often include time, geographic location, and past search results and clicks that have been tracked through websites and advertisements, including cookies that are stored on a device or other hardware.

  10. “Kandis” is a pseudonym.

  11. See H. Schiller, 1996.

  CHAPTER 1. A SOCIETY, SEARCHING

  1. See UN Women, 2013.

  2. See Diaz, 2008; Segev, 2010; Nissenbaum and Introna, 2004.

  3. See Olson, 1998; Berman, 1971; Wilson, 1968; and Furner, 2007.

  4. See Daniels, 2009, 2013; Davis and Gandy, 1999.

  5. See Halavais, 2009, 1–2.

  6. See Angwin et al., 2016.

  7. See O’Neil, 2016, 8.

  8. See Levin, 2016.

  9. See Kleinman, 2015.

  10. The debates over Google as a monopoly were part of a congressional Antitrust Subcommittee hearing on September 21, 2011, and the discussion centered around whether Google is causing harm to consumers through its alleged monopolistic practices. Google has responded to these assertions. See Kohl and Lee, 2011.

  11. See Ascher, 2017.

  12. See Leonard, 2009.

  13. See Daniels, 2009, 2013; Brock, 2009.

  14. See Kendall, 2002.

  15. See Brock, 2009.

  16. See S. Harding, 1987, 7.

  17. See chapter 2 for a detailed discussion of the “Jewish” disclaimer by Google.

  18. See hooks, 1992; Harris-Perry, 2011; Ladson-Billings, 2009; Miller-Young, 2007; Sharpley-Whiting, 1999; C. M. West, 1995; Harris, 1995; Collins, 1991; Hull, Bell-Scott, and Smith, 1982.

  19. See Collins, 1991; hooks, 1992; Harris, 1995; Crenshaw, 1991.

  20. See Brock, 2007.

  21. The “digital divide” is a narrative about the lack of connectivity of underserved or marginalized groups in the United States that stems from the National Telecommunications and Information Administration report on July 8, 1999, Falling through the Net: Defining the Digital Divide.

  22. See Inside Google, 2010.

  23. See Fallows, 2005; Purcell, Brenner, and Rainie, 2012.

  24. A detailed discussion of this subject can be found in a Google disclaimer about the results that surface when a user searches on the word “Jew.” The URL for this disclaimer (now defunct) was www.google.com/explanation.html.

  25. Senate Judiciary Committee, Subcommittee on Antitrust, Competition Policy and Consumer Rights, 2011.

  26. See Elad Segev’s work on Google and global inequality (2010).

  27. A good discussion of the ways that Google uses crowdsourcing as an unpaid labor pool for projects such as Google Image Labeler can be found in the blog Labortainment at http://labortainment.blogspot.com (last accessed June 20, 2012).

  28. See the work of Cameron McCarthy, professor of education at the University of Illinois at Urbana-Champaign (1994).

  29. See Nissenbaum and Introna, 2004; Vaidhyanathan, 2011; Segev, 2010; Diaz, 2008; and Noble, 2014.

  30. This process has been carefully detailed by Levene, 2006.

  31. Blogger, WordPress, Drupal, and other digital media platforms make the process of building and linking to other sites as simple as the press of a button, without requiring any knowledge of code to implement.

  32. See Spink et al., 2001; Jansen and Pooch, 2001; Wolfram, 2008.

  33. See Markey, 2007.

  34. See Ferguson, Kreshel, and Tinkham, 1990.

  35. See Wasson, 1973; Courtney and Whipple, 1983.

  36. See Smith, 1981.

  37. See Bar-Ilan, 2007.

  38. Google’s official statement on how often it crawls is as follows: “Google’s spiders regularly crawl the Web to rebuild our index. Crawls are based on many factors such as PageRank™, links to a page, and crawling constraints such as the number of parameters in a URL. Any number of factors can affect the crawl frequency of individual sites. Our crawl process is algorithmic; computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. We don’t accept payment to crawl a site more frequently.” See Google, “About Google’s Regular Crawling of the Web,” accessed July 6, 2012, http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34439.

  39. Brin and Page, 1998a, 110.

  40. Ibid.

  41. Brin and Page, 1998b, 18, citing Bagdikian, 1983.

  42. On June 27, 2012, the online news outlets The Local and The Raw Story reported on Google’s settlement of the claim based on concerns over linking the word “Jew” with popular personalities. See AFP, 2012.

  43. Anti-Defamation League, 2004.

  44. Ibid.

  45. See Zittrain and Edelman, 2002.

  46. See SEMPO, 2010.

  47. A website dedicated to the history of web memes attributes the precursor to the term “Google bombing” to Archimedes Plutonium, a Usenet celebrity, who created the term “searchenginebombing” in 1997. For more information, see “Google Bombing,” Know Your Meme, accessed June 20, 2012, http://knowyourmeme.com/memes/google-bombing. Others still argue that the first Google bomb was created by Black Sheep, who associated the phrase “French Military Victory” with a redirect to a mock page that looked like Google and listed all of the French military defeats, with the exception of the French Revolution, in which the French were allegedly successful in killing their own French citizens. The first, most infamous instance of Google bombing was the case of Hugedisk magazine linking the text “dumb motherfucker” to a site that supported George W. Bush. For more information, see Calore and Gilbertson, 2001.

  48. Brin and Page note that in the Google prototype, a search on “cellular phone” results in PageRank making the first result a study about the risks of talking on a cell phone while driving.

  49. See SEMPO, 2004, 4.

  50. In 2003, the radio host and columnist Dan Savage encouraged his listeners to go to a website he created, www.santorum.com, and post definitions of the word “santorum” after the Republican senator made a series of antigay remarks that outraged the public.

  51. See Hindman, 2009; Zittrain, 2008; Vaidhyanathan, 2011.

  52. Steele and Iliinsky, 2010, 143.

  53. See Hindman, 2009.

  54. Ibid.

  55. See Gulli and Signorini, 2005.

  56. Federal Communications Commission, 2010.

  57. Associated Press v. United States, 326 U.S. 1, 20 (1945). Diaz (2008) carefully traces the fundamental notion of deliberative democracy and its critical role in keeping the public informed, in the tradition of John Stuart Mill’s treatise “On Liberty,” which contends that democracy cannot flourish without public debate and discourse from the widest range of possible points of view.

  58. See Van Couvering, 2004, 2008; Diaz, 2008; Noble, 2014; and Zimmer, 2009.

  59. See Lev-On, 2008.

  60. See Andrejevic, 2007.

  61. See Goldsmith and Wu, 2006.

  62. H. Schiller, 1996, 48.

  63. See Fallows, 2005; Purcell, Brenner, and Rainie, 2012.

  64. President Eisenhower forewarned of these projects in his farewell speech on January 17, 1961, when he said, “In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.” Eisenhower, 1961.

  65. Niesen, 2012.

  66. The full report can be accessed at www.pewinternet.org.

  67. See Epstein and Robertson, 2015.

  68. Purcell, Brenner, and Rainie, 2012, 2. Pew reports these findings from a survey conducted from January 20 to February 19, 2012, among 2,253 adults, age eighteen and over, including 901 cellphone interviews. Interviews were conducted in English and Spanish. The margin of error for the full sample is plus or minus two percentage points.

  69. Feuz, Fuller, and Stalder, 2011.

  70. Google Web History is designed to track signed-in users’ searches in order to better track their interests. Considerable controversy followed Google’s announcement, and many online articles were published with step-by-step instructions on how to protect privacy by ensuring that Google Web History was disabled. For more information on the controversy, see Tsukayama, 2012. Google has posted official information about its project at http://support.google.com/accounts/bin/answer.py?hl=en&answer=54068&topic=14149&ctx=topic (accessed June 22, 2012).

  71. Leigh Estabrook and Ed Lakner (2000) have conducted a national study on Internet control mechanisms used by libraries, which primarily consist of policies and user education rather than filtering. These policies and mechanisms are meant to deter users from accessing objectionable content, including pornography, but also other material that might be considered offensive.

  72. See Corea, 1993; Dates, 1990; Mastro and Tropp, 2004; Stroman, Merrit, and Matabane, 1989.

  73. The Chicago Urban League has developed a Digital Media Strategy that is specifically concerned with the content and images of Black people on the Internet. See the organization’s website: www.thechicagourbanleague.org.
