It Ain't Easy Peeing Green

"Honey, could you please bring me the tissues out of my bag?" I called from the bathroom in the rundown backpackers' hostel. Dan and I had paid two extra American dollars for en suite facilities, and I'd sat down on the toilet without noticing that there was nothing to wipe with. Tiny ants patrolled the cracks between the sink and the wall, and between the wall and the floor. A few lizards took turns scurrying across the ceiling. I eyed them sharply.

"What for?" Dan asked through the door.

"What do you mean, 'what for'?" I called back, laughing quietly in spite of myself.

From the moment our escape-the-States-before-the-careers-and-babies trip started, my intended and I spent a lot of time talking about toilets. We had recently graduated from college and set off on a splendid six-month vacation that would culminate in a Fijian wedding. We were free of mortgage and debt obligations. We had our youth. We had big dreams and birth control. Before we left, Dan had taken a Southeast Asia guidebook out of the library and given me a quick course in distant culture. I'd learned, among other things, that people in Thailand, our first stop, don't traditionally use toilet paper. But I'd forgotten.
