Train to Downing Street, all aboard

Michael Kinsley decided to give the Downing Street Memo a necessary media boost this weekend with a big "so what?" column that set off a series of responses from Kevin Drum, Juan Cole and others. Memeorandum collected a bunch. Think Progress also talks about Walter Pincus' Washington Post story and the lack of a plan for post-invasion Iraq.

Also in DSM news: Crooks and Liars posted a video produced by the DowningStreetMemo.com folks, who say: "It's tempting in the flurry of minutes and memos and articles to get so wrapped up in the story that you forget exactly who the story is about [my emphasis]... It's not about Republicans and Democrats. It has, and always will be, about them. About those heroes who sacrificed and trusted their government. About those with brave hearts who cannot speak now from the grave. And so, it is up to us to speak for them." (Memeorandum, Think Progress; DowningStreetMemo.com via Crooks and Liars)

UPDATE: Think Progress came through again with a post criticizing the New York Times' memo coverage this morning; Salon's War Room also nails them. (Think Progress, Salon Politics War Room)