On Proportionality, Civilian Casualties, and Why Israel Has Already Lost

Once again, I watch in horror as the Israeli military pounds a densely populated area in "self defense," killing civilians, restricting aid, and causing a humanitarian disaster. And I wonder again why Israel, its supporters, and the Western media just don't get it.

Yes, Hamas fired rockets into Israel. Yes, Israel has the right to defend itself from such attacks. But by bombarding and invading Gaza, killing hundreds of civilians, destroying infrastructure, and blocking aid, Israel has abandoned the moral high ground and become worse than the Hamas terrorists it decries. And that's why Israel has already lost.

As of today, the United Nations estimates that approximately 165 Palestinian civilians have been killed, representing 25 percent of all Palestinian casualties. Since fighting began, 4 Israeli civilians have been killed by Hamas rockets.

The problem is proportionality, a word I've actually been happy to hear used commonly in this discussion. In Errol Morris's excellent Oscar-winning documentary The Fog of War, former Secretary of Defense Robert McNamara argues that "proportionality should be a guideline in war." I agree.

Maybe it's because I'm naïve that I don't see how you can claim the moral high ground, casting yourself as both good guy and victim, when your side is killing 41 times more innocent civilians than the "terrorists." Maybe it's because I am not an expert on the Middle East that I don't understand why killing 41 civilians is a proportional, appropriate, and legal way to show that killing a single civilian is wrong.
