Recall George

Deanna already linked to the video on The Mix, but here are my favorite bits from Bill Maher's rant on George Bush:

Now, I kid, but seriously, Mr. President, this job can't be fun for you anymore. There's no more money to spend. You used up all of that. You can't start another war because you also used up the army. And now, darn the luck, the rest of your term has become the Bush family nightmare: helping poor people.

Yeah, listen to your mom. The cupboard's bare, the credit card's maxed out, and no one is speaking to you: mission accomplished! Now it's time to do what you've always done best: lose interest and walk away. Like you did with your military service. And the oil company. And the baseball team. It's time. Time to move on and try the next fantasy job. How about cowboy or spaceman?! ...

Herbert Hoover was a shitty president, but even he never conceded an entire metropolis to rising water and snakes.

On your watch, we've lost almost all of our allies, the surplus, four airliners, two Trade Centers, a piece of the Pentagon and the City of New Orleans...Maybe you're just not lucky!

I'm not saying you don't love this country. I'm just wondering how much worse it could be if you were on the other side. So, yes, God does speak to you, and what he's saying is, "Take a hint." [TRANSCRIPT]

By the ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company rather than an information company, one that replicates the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda