Trump, With a Vicious Temperament, Seems Eager to Hasten the Doomsday Clock

In the center of Barcelona sits a majestic church, still unfinished, designed by Antoni Gaudí (1852-1926). It is a 20th-century masterpiece, crafted with belief and an edge of mischief. One of the first sculptures Gaudí produced for La Sagrada Família sits at one of its entryways: the Last Supper. Jesus sits next to Judas. He turns toward him, but pivots a few degrees too far and looks into the eyes of the believers (and tourists) walking into the church. ‘You too have betrayed me,’ he says to Judas. But he is looking at the weary churchgoers and tourists as he says it. Betrayal is one thing. Complicity is another. You might not have done the bad thing yourself, but if you did nothing about it, wouldn’t that be just as bad?

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is it a sign that Google, an ad company rather than an information company, is replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda
Sign up