Crackdown Begins: Food Not Bombs House Among Saturday Raids Ahead of RNC


Minnesota Indy RNC reporter Jeff Severns Guntzel is at the Minneapolis Food Not Bombs house, which was raided by police this morning. Facts are still coming in, but Guntzel says that at 8 a.m. neighbors near the home, located at 2301 23rd Avenue South, reported hearing a loud bang followed by yelling. A single police squad car was parked out front. When Guntzel arrived, he saw eight or nine officers enter the house in what he says is a joint operation between the Ramsey County Sheriff's Department, the Minneapolis Police Department, and the FBI. According to one witness who was in the house at the time of the raid, the action is related to last night's raid on the RNC Welcoming Committee's "convergence space." Several other spaces have been raided this morning.

Around 9:20 a.m., two Minneapolis Police Property & Evidence trucks pulled up. Present were a Hennepin County Sheriff's crime lab truck, a Ramsey County Sheriff's squad, and an MPD squad, plus at least four unmarked cars parked facing the wrong direction in traffic. Police tape is marking off the yard.
Minneapolis Police Department officers stand guard outside the Food Not Bombs house. Photo: Jeff Severns Guntzel

Two men are released from the Food Not Bombs house as reporters, neighbors, and supporters look on. Photo: Jeff Severns Guntzel

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda