Democrats need a plan to beat back Trump's election lies at the ballot box


Last week, to very little fanfare, House Democrats released their 2020 "after action report," also known as an "autopsy." The team, led by Rep. Sean Maloney, D-N.Y., included Reps. Jim Himes, D-Conn., Katie Porter, D-Calif., and Nikema Williams, D-Ga., and was tasked with finding out how the House managed to lose so many seats in an election in which the Democratic nominee unseated an incumbent Republican president. Working with senior staff, Democrats analyzed the voter files from the presidential election and other state and local data and compared them with 600 different House race polls from 2020. According to this report in the Washington Post, they didn't really find anything that most observers hadn't already assumed from the results.

It turns out that Democrats underestimated the number of hardcore Trump lovers, which they surmised made the "defund the police" and "socialism" lies more potent in swing districts. That underestimation is attributed to bad polling, a problem pollsters themselves have acknowledged: many Republicans simply aren't responding anymore, and pollsters failed to weight their samples accordingly. (This has been going on for a while and really needs to be dealt with.) Maloney told the caucus that such faulty polling led them to spend too much time and money on "red-to-blue" districts and not enough defending their incumbents in what turned out to be tight races.

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is it a sign that Google is an ad company, not an information company, one that replicates the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
