Ilhan Omar reveals disturbing threat received after attacks from Lauren Boebert and Fox News

When Rep. Ilhan Omar of Minnesota was elected to the U.S. House of Representatives in 2018, the progressive Democrat expected to have frequent policy differences not only with Republicans, but also with centrists in her own party. However, attacks on Omar from the far right — including Rep. Marjorie Taylor Greene of Georgia, Rep. Lauren Boebert of Colorado and pundits at Fox News — go well beyond policy disagreements, painting her as fundamentally anti-American and an outright enemy of the United States. And on June 9, Omar tweeted a racist, threatening message her office received following a tweet by Boebert and the June 7 publication of an anti-Omar article by Houston Keene on Fox News' website.

In a June 8 tweet, Boebert — a supporter of the far-right QAnon conspiracy cult — described Omar as an "honorary member of Hamas" and one of the "terrorist sympathizers in Congress." Members of QAnon were among the violent extremists involved in the January 6 insurrectionist assault on the U.S. Capitol Building.

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda