8 Most Absurd Lawsuits of 2015

The videotaped sight, in October this year, of Hungarian camerawoman Petra Laszlo deliberately tripping refugees and their children as they rushed past her was beyond shocking. The video went viral, and Laszlo apologized for her actions, sort of, claiming that, contrary to appearances, she is not a monster: she said she was frightened by the panicked refugees and was instinctively protecting herself. The images, however, tell another story. The video became an Internet sensation, and social media condemned Laszlo as basically the worst person on Earth. Her subsequent actions pretty much confirmed that assessment: she decided to sue not only Facebook, for neglecting to take down worldwide criticism of her footwork, but also one of the refugees she tripped. According to Laszlo, the man slanderously changed his story, first blaming the police for tripping him, then identifying the camerawoman (presumably after he saw videos of her tripping and kicking him). Laszlo's reasoning? "My husband wants to prove my innocence. For him, it is now a matter of honor." Huh.

By the ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, as an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
