Media Corrections We’d Like to See

Longtime readers of Mad Magazine may remember a regular feature called "Scenes We'd Like to See." It showed what might happen if candor replaced customary euphemisms and evasions. These days, what media scenes would we like to see?

One aspect of news media that needs a different paradigm is the correction ritual. Newspapers are sometimes willing to acknowledge faulty reporting, but the "correction box" is routinely inadequate -- the journalistic equivalent of self-flagellation for jaywalking while serving as an accessory to deadly crimes.

Some daily papers are scrupulous about correcting the smallest factual errors that have made it into print. So, we learn that a first name was misspelled or a date was wrong or a person was misidentified in a photo caption. However, we rarely encounter a correction that addresses a fundamental flaw in what passes for ongoing journalism.

Here are some of the basic corrections that we'd really like to see:
