Why Aren’t We Using This Gun Technology That Could Save Hundreds of Children’s Lives?

Every year in the United States, over 7,000 children and teenagers are sent to the ER because of shooting accidents, and another 3,000 die before they even make it to a hospital. These numbers, taken from a recent study published in the journal Pediatrics, are horrific, but they’re unlikely to change any time soon. Children have access to guns in their own homes, in the homes of friends and family, and even, in some states, at shooting ranges. More than one-third of all U.S. households have guns, and a study published in the Journal of Trauma found that “children 5-14 years old were more likely to die from unintentional firearm injuries, suicides and homicides if they lived in states with more rather than fewer guns.” Put simply: access and exposure to firearms increase the chances that a kid or teen will be involved in an unintentional shooting incident. According to the Centers for Disease Control and Prevention, such incidents are among the leading causes of accidental death for children between the ages of 1 and 14.

By the ACLU (Sponsored)

Imagine you’ve once again forgotten the difference between a gorilla and a chimpanzee, so you do a quick Google image search for “gorilla.” But instead of images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google, an ad company rather than an information company, replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says Dr. Safiya Noble, UCLA professor and best-selling author of Algorithms of Oppression. “The logic is racist and sexist because it would allow for these kinds of false, misleading kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.

What you can do:
Take the pledge: Systemic Equality Agenda