DHC Weekly 2/22: Algorithms of Oppression at the DHC

This Thursday, we welcomed faculty members, library staff, and students into the DHC for a lunchtime discussion of Safiya Umoja Noble’s book Algorithms of Oppression. We were so thrilled by the turnout, and by everyone’s eager and insightful comments.

[Photo: reading group attendees around the DHC tables]

A large part of our discussion of Dr. Noble’s book centered on questioning the roles and dynamics around the creation and consumption of algorithms, which too often go unscrutinized. Professor Kim Hall challenged us to question the commonly held narrative of the lone-wolf tech genius, asking: in whose interest is it for us to keep imagining a single brain in a room creating algorithms, rather than the teams of developers who actually do this work?

[Photo: Professor Kim Hall speaks while other attendees listen]

Just as the popular conception of the creators of algorithms like Google’s often does not line up with reality, so too are our own identities relative to companies like Google not necessarily what we believe them to be. We like to think of ourselves as customers of a product like Google search, a position that implies a certain amount of power and agency. As Dr. Noble points out, however, Google search is first and foremost an advertising platform, albeit one that presents itself as an agenda-less public service. The “product” in question, then, is not the information to be found via Google search but the views to be garnered for advertisers; it is the advertisers who are Google’s customers, and the public (or its clicks and views) is not the consumer but the product being offered.

Finally, we discussed a recent Wall Street Journal article, “A Crucial Step for Averting AI Disasters,” which details how many tech companies are coming to realize that their algorithms are biased, and that those biases can and will draw bad press and public embarrassment. This realization, according to the article, has been leading tech companies to build more diverse teams of developers, who are less likely to replicate the blind spots and biases of a white- and male-driven field.

Bad algorithms are bad for business — but what about what’s good for the people? We closed out our discussion by attempting, in small groups, to imagine our ideal search engines, and got back answers — particularly from the Computer Science majors in the room — centered on lateral rather than hierarchical arrays of information, data privacy and informed consent, and democratized community content moderation.

[Photo: attendees sit around the DHC tables and discuss]

In two weeks, on March 7th from 12 to 1:30 pm, we will be hosting a follow-up Wikipedia editing workshop. I hope to see you all there!
