Hey Barack, guess where Northern Iraq is?

Dem rock-star Barack Obama called for a "gradual withdrawal" of U.S. forces, without a firm end date, "linked to conditions on the ground in Iraq and based on the advice of U.S. commanders."

Could you get any mushier?

And the troops aren't necessarily coming home. To where might they withdraw? "He proposed redeploying troops to Northern Iraq and to other countries in the region."

God bless Obama, but I don't think he's really getting this whole withdrawal concept.

I actually have heard one analyst suggest that the U.S. could pull its troops into the relatively friendly Kurdish-controlled areas in the hope of salvaging something. If the Baker Commission is considering whether to "go long," "go deep" or "go home," this plan would be "go build a tree-house and hide from the bad guys."

Anyway, while some people I respect favor a gradual withdrawal without a specific end date, it seems to me the worst of both worlds: it reduces U.S. forces' ability to keep major battles from breaking out, but it doesn't give the Iraqi government the boost in legitimacy that a date-certain might -- might -- give it (especially if we have them demand it), and it doesn't give Iraqis a guarantee that the U.S. won't maintain permanent bases in the country.

By ACLU | Sponsored

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is it that Google, an ad company rather than an information company, is replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism," says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
