The lessons of two failed wars

Sgt. Logon Ross, left, and Sgt. Addison Owen, right, Company B, 1st Battalion, 26th Infantry Regiment, Task Force Strike, 101st Airborne Division (Air Assault), in the U.S. compound at the Qayyarah West Airfield, Iraq, Nov. 1, 2016. Company B provides security for Coalition forces on the base and was one of the first units at the location. Owen is an infantryman at the base and is on his second deployment to Iraq. On his first deployment, in the city of Basra during 2010-2011, Owen was wounded when an explosively formed penetrator hit his vehicle. (U.S. Army photo by 1st Lt. Daniel Johnson)

In choosing a title for his final, posthumously published book, the prominent public intellectual Tony Judt turned to a poem by Oliver Goldsmith, The Deserted Village, published in 1770. Judt found his book's title in the first words of this couplet:

Ill fares the land, to hastening ills a prey,
Where wealth accumulates, and men decay.

A poignant sentiment, but let me acknowledge that I'm not a big Goldsmith fan. My own preferences in verse run more toward Merle Haggard, whose country music hits include the following lyric from his 1982 song "Are the Good Times Really Over?":
