Dying at home: Disturbing data suggests official coronavirus deaths are 'just the tip of the iceberg'

Airmen from the 18th Medical Group conduct COVID-19 testing at Kadena Air Base, Japan, March 20. Under the most current guidance from the Centers for Disease Control and Prevention, the 18 MDG has increased its testing for the disease. Those who are tested become Persons Under Investigation (PUI), are contacted by Public Health, placed into isolation, and instructed on how to avoid spreading their illness to family members in the home. Public Health interviews the PUI and develops a list of “close contacts,” who are then called and given instructions to quarantine for 14 days. A close contact is someone who lived with or cared for a PUI, had direct physical contact with a PUI, shared eating utensils with a PUI, or had prolonged close conversation with a PUI. On average, lab results take 2-5 days to return. If results are negative, isolated and quarantined individuals will be notified and released. If results come back positive, quarantine for the close contacts will continue for 14 days, and isolation for the PUI will continue until the PUI is medically cleared. Someone who has had contact with a person deemed a close contact does not need to be placed in quarantine but should continue to practice social distancing. (U.S. Air Force photo by Senior Airman Rhett Isbell)

ProPublica is a Pulitzer Prize-winning investigative newsroom.

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. “The logic is racist and sexist because it would allow for these kinds of false, misleading results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
