


By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda
Sign up