Five Women Buried Alive -- and the Media Ignore It

Last month, the U.S. media were full of stories about the resignation of Pervez Musharraf as president of Pakistan. But another event that same week in Pakistan -- that tribesmen buried five young women alive for wanting to choose their own husbands -- got almost no coverage.

According to the Asian Human Rights Commission, the women's "crime" was that they defied tribal elders and arranged marriages to men of their own choosing in a civil court. They were abducted at gunpoint, dragged off to a remote field, beaten, shot, thrown into a ditch, and then, while still breathing, smothered to death with rocks and mud.

Yet not even when a member of the Pakistani parliament, Israr Ullah Zehri, defended these barbaric killings as "century-old traditions" -- when he said that killing women who defy male control by wanting to choose their own husbands is necessary to "stop obscenity" -- was there international outrage.

Why is this? And why is there no international outrage about the fact that violence against women and female children is indeed a "century-old tradition"?

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or, is Google an ad company, not an information company, that's replicating the discrimination of the world it operates in? How can this discrimination be addressed and who is accountable for it?

“These platforms are encoded with racism," says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
