Algorithms of Oppression

Safiya Noble on President’s Speakers Series

“As today’s speaker, Safiya Umoja Noble, writes in her new book, Algorithms of Oppression: How Search Engines Reinforce Racism, and I quote, ‘We are increasingly being inculturated to the notion that digital technologies, particularly search engines, can give us better information than other human beings can.

“People will often take complex questions to the web, to a Google search, rather than going to the library or taking a class on the subject. The idea is the answer can be found in .3 seconds to questions that have been debated for thousands of years,’” said California State University, Monterey Bay (CSUMB) President Eduardo Ochoa, introducing Dr. Safiya Noble on Wednesday, Nov. 28 at the World Theater.

Dr. Safiya Noble, an associate professor at UCLA and USC and a best-selling author, came to CSUMB to address the central argument of her new book, Algorithms of Oppression. Noble spoke on the key factors that contribute to search engines, specifically Google, reinforcing racial stereotypes, and outlined how the algorithms embedded in these search engines can steer searches toward racist outcomes.

Noble gave many examples of search engines such as Google leading people to racist search results or disinformation. The first came from Google Maps during the Obama administration, when, Noble said, searching the term “N-word house” would return directions to the White House. Another example was discovered not by Noble but by Twitter user Kabir Ali, who found that a Google image search for “three black teenagers” returned mugshots, whereas the same search for “three white teenagers” returned stock photos of white teenagers posing with sports equipment.

With little to no response from Google about these search results, Noble argued that these algorithms are, in fact, reproducing racial stereotypes. “We have more data and technology than ever, and more social, political and economic inequality and injustice to go with it,” she said, quoting her book. In one of her slides, Noble outlined the ‘theoretical frameworks’ behind her research: the social construction of technology, and black feminist and critical race theory. A few key bullet points read, “Technology is a social construction, embedded with social and political values. Power relations are based on our historical, social, and economic positions.”

In the next portion of her speech, Noble addressed what she called the Case of Dylann “Storm” Roof. Roof was convicted of murdering nine African Americans in a 2015 Charleston church shooting. Noble explained how Roof’s decision to murder innocent people came in part as a result of his Google searches. A manifesto by Roof, found shortly after the shooting, outlined the path that led him to murder. “The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name, and eventually decided to look him up. I read the Wikipedia article and right away was unable to understand what the big deal was. It was obvious Zimmerman was in the right.

“But more importantly this prompted me to type the words ‘black on white crime’ into Google, and I have never been the same since that day. The first website that I came to was the Council of Conservative Citizens…from here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware,” read Roof’s manifesto.

Noble explained that the results readily available to Roof when he searched “black on white crime” were all white supremacist websites disguised to look factual. Roof would have needed only to search “white on black crime” to get a more realistic picture of crime in the United States.

In response to the lack of change from Google, Noble said, “We see with the intense calls for Facebook and Google, in particular, to be held to account for the crimes of information that move through their systems that this is not going to be an issue that goes away anytime soon. And this is a very important time for us to be thinking about algorithms and automated decision-making systems that really are not sophisticated enough to recognize certain types of threats.”

“Oftentimes we find ourselves in meetings, in conferences, with people that say AI [artificial intelligence] is going to solve these problems, that AI is going to recognize the threats or the disinformation…in fact, I think we heard that from Mark Zuckerberg in his testimony to Congress a few months ago. And yet, we know that AI is actually still trying to figure out if this podium is a podium…so I’m not really sure how we’re going to get to these more complex decisions from AI, but we’re certainly a long ways away from that,” said Noble.
