The House impeachment managers make a searing case against Trump — but they're leaving out an ugly truth

Photo: Rep. David Cicilline

In the hours before the second day of Donald Trump's impeachment trial, Hillary Clinton tweeted her blunt assessment of the situation: "If Senate Republicans fail to convict Donald Trump, it won't be because the facts were with him or his lawyers mounted a competent defense. It will be because the jury includes his co-conspirators." The House impeachment managers then spent the rest of the day proving Clinton correct — even as they insisted that they were trying Trump and not the larger GOP.

In a careful case, the Democratic House managers laid out an astonishing amount of public evidence showing that Trump spent months riling up his supporters before he sicced them on the Capitol on Jan. 6 in an attempt to overturn the results of the 2020 election. It was such a slam-dunk case that, quite literally, the only reason any senator would vote to acquit is outright complicity with Trump's insurrection.

Sponsored content by the ACLU

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is it a sign that Google, an ad company rather than an information company, is replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore. … There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
