Durst: Blather About Bosnia

Now we (the good guys) are threatening the Serbs (the bad guys, we think) with a substantial tongue-lashing, not to mention a very questionable international credit rating, if they don't start doing the things we want them to. We're still not sure exactly what those things are, except that they've got to stop killing the Muslims, who admittedly would be trying to kill the Serbs first if they could only get their hands on some guns, which they can't, because we have slapped an arms embargo on the whole place. So what's really going on is this: the Rapid Reaction Force isn't reacting rapidly, the Safe Areas aren't safe, and people are dying because they can't get any guns. Oh, all right, it's all starting to make sense. Now if I can only figure out what the term "UN Peacekeepers" is supposed to mean.

By the ACLU

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism," says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda