Safiya Umoja Noble - “Algorithms of Oppression”

Important subject matter and good, data-backed observations; on the other hand, a dry and somewhat uninspired execution. But maybe that's partly due to the topic being unpleasant to deal with?

The main argument of the book can be summarized with this (lightly paraphrased) quote: “Taking away women, African-, Hispanic-, Asian-, and Native Americans, French Canadians, lesbians, gay men, people with disabilities, anyone who isn’t Christian, one is left with a very small core.”

The problem the author highlights is that society still treats this small core as if it represented every person, the “universal man”. In particular, this point of view causes search engines (the book’s attention is directed entirely at Google) to be racist and sexist.

The author shows very well how Google is not a public-good service, how PageRank is not like academic citation, and how the first-page results are heavily optimized to profit Google.

There’s talk about how algorithms can be racist by catering to this elusive core of “universal man”, with concrete examples of sexualizing ethnicities (searches like “black girls”), passing fringe opinions off as fact (searches like “black on white crime” or “Jews”), and so on. The author discusses the right to be forgotten, balancing society’s need to be informed against constant, unjust persecution for long-past behavior. This is especially important in light of how easy it is to game Google results (SEO, “Google bombing”); one example discussed is mugshot extortion websites. It turns out 95% of “right to be forgotten” takedown requests come from private citizens who are not politicians, public figures, or convicted criminals.

The author argues that Google is not doing nearly enough to address these problems. They are capable of doing so, as demonstrated by effective information removal when it comes to copyright infringement; it’s argued that this is likely because media corporations are Google’s customers. Misinformation, racism, and sexism are clearly less of a priority, and on multiple occasions the company has disclaimed responsibility for bad search results. The author asks: “If Google is not responsible for their algorithms, then who is?”

I enjoyed the discussion of the historical context of this idea of a “universal man”, of treating what’s outside this small core norm as worse. Since the invention of print, national identities galvanized, national languages became more consistent, and both the need and the ability to classify grew.

In fact, a brilliant point is made that the Library of Congress subject headings explicitly encoded their creators’ biases, using racist categories like “the Jewish question”, “the yellow peril”, “Negroes”, and “illegal immigrants”. 80% of the Dewey classification on religion is about Christianity, and some religions are labeled “primitive”.

The author argues that Google search is a real-time, mutable analogue of the Library of Congress classification. The danger of misrepresentation in classifying people hits marginalized groups hardest, like women or people of color. It is thus patriarchal and racist.

Finally, the author discusses how the notion of “colorblindness” with respect to race is not a solution, because erasing racial identity is hurtful as well. Denying people the context of their culture, their upbringing, and past and present injustice is a form of racism itself. A concrete example of this problem is a black hair stylist who was denied the ability to highlight her salon’s specialty on “colorblindness” grounds.

Overall, this is all important stuff. Not sure why, but it took me a long time to get through it. The academic language, as well as the rhythm and tone of the book, were not particularly inviting to me. I recommend reading it regardless.

#Books