Dean rips Bush, GOP, Roberts (video)

Mostly paraphrasing because you can just go watch the [VIDEO] to get your fix...

On Katrina: After Dean faults Bush's gutting of FEMA and his pathetic response, Colmes says: "the president did take personal responsibility..."

To which Dean responds: Well, that's one thing, but a lot of people are dead.

Yeah, that's another, isn't it?

On the estate tax (or, as house liberal Alan Colmes puts it, the death tax): It was a moral choice, and we made the wrong one.

On Bush's low approval ratings: This is the most divisive president since perhaps before the Civil War, and he got there by not telling the truth.

On Roberts, and how the president's policies promote racism, homophobia, and injustice toward women.

On the Roberts hearings: I didn't hear any answers, just a lot of legal mumbo jumbo...

The best part: according to Susanhu, this video of Howard Dean demolishing the president on Katrina, Republican immorality, and Roberts' mumbo-jumbo answers is the second most popular video on Fox News' website. What do you think? Masochism or a change a comin'? (Booman Tribune & Crooks and Liars)


By ACLU (sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company, not an information company, that's replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
- Take the pledge: Systemic Equality Agenda
- Sign up