Here's why experts are increasingly worried about new coronavirus mutations

With the COVID-19 coronavirus having killed more than 2 million people worldwide and over 402,000 in the United States (according to Johns Hopkins University in Baltimore), the distribution of new vaccines from Pfizer, Moderna and others is very good news. Health experts, however, are worried about new COVID-19 variants that have emerged in different parts of the world, and science reporter Sarah Zhang examines why those variants are so troubling in an article published this week in The Atlantic.

"For most of 2020," Zhang explains, "the coronavirus that causes COVID-19 jumped from human to human, accumulating mutations at a steady rate of two per month — not especially impressive for a virus. These mutations have largely had little effect. But recently, three distinct versions of the virus seem to have independently converged on some of the same mutations, despite being thousands of miles apart in the United Kingdom, South Africa and Brazil."

Sponsored content by the ACLU

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google an ad company, not an information company, one that replicates the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda