The Unbelievable Inhumanity of Solitary Confinement - And Punishment for as Little as Reading a Book

The cells are the size of a parking space. In the corner is a toilet and an open steel shower stall that emits water for 15-minute periods three times a week. Through a small slot in one wall, guards slide trays of food for breakfast, lunch, and dinner, if they feel like it. Sometimes the food comes covered in hair. Other days it doesn’t come at all. The cell remains locked for 23 hours a day. During the final hour, a door slides open, allowing the occupant to walk into a small outdoor kennel enclosed by concrete walls or metal grates. The pen is too small to do anything except pace back and forth, stare at the sky, and listen to the din of other caged men railing against their confinement.


Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is it a sign that Google, an ad company rather than an information company, is replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda