$5,600,000,000 (video)

A couple of years back, the curmudgeonly Rooney, whom my mother loves, flipped out on TV. It wasn't widely reported; it wasn't widely noticed.

It was just prior to, or immediately following, the bombing of Iraq, and he seemed to be flashing back to WWII. A Frenchman was giving him a hard time, he recalled, and he barked something about that Frenchman speaking German were it not for America. True enough, I suppose, but the anti-France bandwagon-hopping was a huge disappointment and cooled me on the man.

Well, better late than never I suppose.

Trying to grasp just what $5,600,000,000 actually is, Rooney touches on the way we pay for such massive expenditures (borrowing from countries like China) and catalogs the services cut, asking: "Do these sound like things that you'd like to cut back on?"

He then compares worldwide military spending, culminating in a warning from a past president. It's simple, crusty, effective. It's why my mother loves him. (Crooks & Liars has the video.)

