Despite Their Rarity, When the Next Terrorist Attack Comes, Will We Be Able to Handle It?

Imagine it's six months from now. A 19-year-old man—whom we'll later learn was in communication with members of ISIL in the Middle East—walks onto the National Mall in Washington on a weekend afternoon. Groups of tourists wander from one monument to another. He takes his backpack off his shoulders, reaches in, and removes the semiautomatic rifle he bought a month before at a gun show in Virginia, where he didn't have to submit to a background check (though it wouldn't have mattered, because his record is clean). He opens fire on the crowd, and before U.S. Park Police are able to reach him and put him down, he has killed six people and wounded eleven others. In his pocket is a note announcing his devotion to ISIL and declaring that he is striking at the United States in retaliation for its illegal war on the true Muslims building a caliphate in Syria and Iraq.
