2006 XMA$ GIFT WI$H LI$T

Bah humbug, everybody. Consider that uttered in the spirit of those of us familiar with the soft dark underbelly of the happiest time of the year. The ones regularly washed over by the holiday faucet of red and green bile, dreading the solstice celebration as it drips down the drain of melancholy, revealing the regurgitated fruit of our greed and gluttony. But then again, what the hell. Pass me a cookie and another glass of nog and let's just enjoy the whole thing, shall we? And go easy on the nutmeg and heavy on the whiskey, mister. Because it's time to just sit back and relax.

Xmas is still with us, as we are repeatedly reminded by the television ads partially obscured by the coffee-table-high wrapping-paper detritus. So to honor all you brave and steadfast consumers who set new records this year in your patriotic quest to sink heavily into debt celebrating the birth of that Jewish hippie kid, let me offer up to the least deserving of us my annual scathingly incisive yet perennially trenchant: Will Durst's 2006 XMA$ GIFT WI$H LI$T.
