Hearing on coronavirus ends abruptly as White House tells experts to come to 'emergency meeting'


On Wednesday morning, medical experts, including National Institute of Allergy and Infectious Diseases director Dr. Anthony Fauci and the director of the Centers for Disease Control and Prevention, Dr. Robert Redfield, were testifying before the House Oversight Committee on what to expect from the coronavirus epidemic in the United States. According to Fauci, “The bottom line: It is going to get worse.” Again and again, the information provided in the hearing completely contradicted the rosy statements that have been coming from Donald Trump and other White House officials and warned of a dire situation ahead.

By ACLU (Sponsored)

Imagine you've forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of "gorilla." But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, simply replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

"These platforms are encoded with racism," says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. "The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore… There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination."

On At Liberty this week, Dr. Noble joined us to discuss what she calls "algorithmic oppression," and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda