Israelis Kill Palestinian Child--or do they? Pictures can lie.

I just received a fascinating video from a friend, Norbert Majerholc. It's a perfect demonstration of how deceptive television coverage can be: a record of how two of France's major TV channels reported the same tragic incident in a refugee camp in Nablus on October 27, 2004.

But I suggest you read on before downloading the file.

For those who don't speak French, I've translated each report.

According to the first voice-over from France's FR 3:

"An Israeli vehicle enters the refugee camp of Balata in Nablus this morning. Officially they are taking part in a clean up operation to capture suspects. In the street, some young Palestinians begin to throw rocks in the direction of the Israeli army troops, who reply with real bullets. A Palestinian woman comes out of her house, screaming, her child in her arms. Her son has just been hit by a mortal bullet in the neck. Little Khaled was six years old."

That's how the report ends.

Must have been the Israelis who shot the kid, right?

Wrong. A cameraman for France's TF1 covered the same Israeli jeep as it entered the camp, but from a different angle:
