The dark roots of Republican lies

Image credit: Office of Congresswoman Elise Stefanik

I wish more people understood the democratic nature of facts. They are things that you, I, and everyone we know can engage with. We can interpret them. We can use them for the ends we choose to pursue. But facts are like public property. Morally, I cannot exercise exclusive control over them, because they belong to everyone. Because of that, they demand that everyone act virtuously. They demand that a majority take care of them and take responsibility for their well-being, for the sake of community and the common good.

I don't mean to sound like an optimist. I live in the same world you do. That there are individuals here who would use facts in bad faith for immoral purposes is all the more reason to remind everyone of the democratic nature of facts. Everyone has a stake in their use, whether they know it or not. Our political community is injured by their abuse. Facts are not only democratic; they are fundamentally equitable. They put us all on a level playing field by eliciting our respect for them. Liars may think they're individualists getting away with something, but they hurt themselves, too, in the end.



By the ACLU | Sponsored

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company, not an information company, one that replicates the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda