Hedge Fund Mogul Isn't the Only One to Listen to Kenny Rogers' 'The Gambler' Over and Over

I remember the smells of PBR in the air, jocks in white baseball hats swizzling lacrosse sticks in their hands, and the sound of ping pong balls bouncing into plastic Solo cups like it was yesterday. But it was really the mid-'90s. This was a frequent late-night high school experience in NW Washington, DC: some preppy party I didn't want to be attending, and probably wasn't really welcome at. It just so happened that it was either go to these parties or don't go to any parties at all. If I could name a theme song for these people and their parties, it would be Kenny Rogers' 'The Gambler.' I once heard that song play at a beer pong session at least 10 times in a row. "One more time!" Imagine watching a room full of heaving jock guys n' gals chanting the lyrics of the most commercialized and insipid country-and-western singer of the '80s for half an hour, as though they were reciting Rumi or... as though it had a drop of meaning. I guess it had meaning for them.

By the ACLU (sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
