Since we’re talking about the racism built into algorithms this morning, I thought I could share this review I wrote for the NEA Newsletter (January 2019). The published review had to be revised down to 500 words; this is the extended version. 

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism (New York University Press, 2018).

In this accessible and deeply researched volume, critical information studies scholar Safiya Noble (University of California, Los Angeles; co-editor of The Intersectional Internet) uses the Google search engine as a case study to document and theorize the ways in which racism and sexism are embedded within the structures of information harvest and delivery by for-profit companies on the World Wide Web. Noble considers the mechanisms through which results are delivered to those who use Google search, foregrounds the power relationships that shape the nature and hierarchy of those results, and challenges readers to denaturalize the process of “Googling it” when we have a question in need of a ready and reliable answer. As a sociologist and a critical library and information science scholar, Noble weaves together a librarian’s understanding of how cataloging, classification, and research tools operate with a critical Black feminist understanding of the interlocking systems of oppression from within which these technological systems of information organization and retrieval were designed. Often assumed by both developers and the general public to be value-neutral, the typically invisible (and often proprietary) algorithms by which human beings create and access content online are inescapably shaped by these logics of oppression — logics often deemed normal, normative, and therefore “neutral” to those who benefit from them, even when they are anything but.

To make her case that we are living with a hegemonic culture of “algorithmic oppression” (4), Noble walks readers through a series of examples that follow from a catalytic moment early in her graduate school career. She describes in chapter one putting the search string “black girls” into Google search in an effort to find activities for a group of preteen girls only to be inundated with a list of racist and hypersexualized results:

The best information, as listed by rank in the search results, was certainly not the best for me, or for the children I love. For whom, then, was this the best information and who decides? What were the profit and other motives driving this information to the top of the results? How had the notion of neutrality in information ranking and retrieval … remained so unexamined and without public critique? (18)

This first chapter walks readers through the basic concepts of algorithmic search and what can (and cannot) be gleaned about Google’s development of PageRank, its proprietary algorithmic product, from early concept documents. Based on the idea, borrowed from academia, that the most influential literature is also the most often cited, PageRank began with the assumption that a link to a web page was analogous to a citation, and that the web pages with the most links were therefore the most influential and, by extension, the best (itself an assumption about power and authority that must be critically examined). While the algorithm itself may be proprietary, the fact that we cannot analyze the mechanism of Google search at the code level does not prevent us from observing — as Noble does — that Google’s algorithms produce search results that are anything but impartial. Not only does PageRank encourage searchers to engage with advertiser content — advertisers are, after all, Google’s primary clients — but it also reproduces and amplifies harmful beliefs.
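The citation analogy Noble describes can be made concrete with a toy sketch. What follows is a minimal, textbook-style power-iteration version of the PageRank idea — links treated as votes, with heavily linked pages accumulating rank — not Google’s actual proprietary implementation, and the three-page “web” is entirely hypothetical:

```python
# Toy PageRank sketch: a link from page A to page B is treated like a
# citation, so B's rank grows with the ranks of the pages linking to it.
# This illustrates the logic Noble summarizes, not Google's real system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # dangling page: distribute its rank evenly to everyone
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # each outlink receives an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, so B is ranked
# "best" -- purely from link structure, regardless of whether B's
# content is accurate, neutral, or harmful.
web = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # prints "B"
```

The point of the sketch is Noble’s point: “best” here is an artifact of who links to whom, an assumption about authority baked into the ranking rather than a judgment about the content itself.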

Chapter two delves into specific examples of such searches, and casts a skeptical glance at the efforts of Google executives to distance their company from these harms. Searches including the word “Jew,” for example, produce a high proportion of anti-Semitic content (42); image searches for “doctor” return pictures of mostly white men, while image searches for “unprofessional hairstyles for work” produce pictures of Black women (83). Whether or not Google software developers set out to create an algorithm that generates and amplifies the misogynoir[1] of our culture is beside the point. “Intent is not particularly important,” Noble reminds us (90). Whether or not a white person means to be racist (or a developer means to practice misogyny) is a question that may be unanswerable. Rather, as critical information workers and consumers, we must ask — regardless of intent — who is harmed by the images and ideas circulated through Google search interactions.

Having considered the technological processes and biased, arguably harmful results of the Google search product, Noble moves on in chapters three and four to consider other ways in which Google’s dominance in our online lives operates to further marginalize the already marginalized. As in the offline world, without purposeful and ongoing efforts to combat structural oppression in online spaces, inequality persists. Chapter three explores how the Internet, as a space governed by commercial interests rather than as a noncommercial public good, can cultivate and exacerbate harmful and false ideas. When the goal is to generate clicks for advertisers, there is little incentive for search products to “intercede in the framing of the question itself,” and challenge the searcher to critically examine their own desires or beliefs (116). Chapter four raises questions of data privacy and the right to be forgotten by an Internet that never forgets, particularly as increased visibility may deepen the vulnerability of already-vulnerable populations.[2]

After reading the book and turning in my review, I had some further thoughts about the way sexually-explicit materials were handled within the text. A thread sharing those thoughts may be found on my Twitter timeline.

The final two chapters of Algorithms, along with a brief epilogue that considers the harrowing challenge of our current political moment, turn from the structural problem of algorithmic oppression toward potential solutions. One key intervention is to increase critical awareness of our digital ecosystem’s biases, a project that librarians and other information workers could be particularly well-positioned to undertake. Noble also champions a “public search engine alternative” to the current commercial options (152), a government-funded check on Google’s troubling power within and over almost every aspect of our interconnected lives and livelihoods. Even if that remedy seems politically unrealistic in the near future, it may be a public works project worth fighting for.

By focusing on the ubiquitous tool of Google search, Algorithms gives those just beginning to think critically about our Internet-centric information ecosystem concrete and replicable examples of algorithmic oppression in action. For those already steeped in the rapidly growing literature of critical librarian and information studies, Algorithms will be a valuable addition to our corpus of texts that blend theory and practice, both documenting the problematic nature of where we are and the possibility of where we might arrive in the future if we fight, collectively, to make it so.


[1] “Misogynoir” is a term coined by Moya Bailey to describe the particular misogyny that Black women experience, a misogyny inextricable from the racism they experience under white supremacy. See Moya Bailey, “They Aren’t Talking About Me,” Crunk Feminist Collective, 14 March 2010. http://www.crunkfeministcollective.com/2010/03/14/they-arent-talking-about-me/.

[2] The “right to be forgotten” is also a concept generating renewed interest in archives and cultural heritage study and practice; see for example Ashley Nicole Vavra, “The Right to Be Forgotten: An Archival Perspective.” The American Archivist vol. 81, no. 1 (Spring/Summer 2018): 100-111. https://doi.org/10.17723/0360-9081-81.1.100.