FBI's Vast Facial Recognition Database More Likely to Misidentify Innocent Blacks as Suspects

In 2010, the FBI launched Next Generation Identification, a sprawling, complex program designed to use biometric tools like facial recognition, finger and palm prints, and iris scans in criminal investigations. At the time, privacy advocates worried that the FBI would collect and use the data without adequate oversight or privacy protections, especially given the rapid advances in facial recognition technology.

Last week, a hearing of the House Committee on Oversight and Government Reform found that privacy experts were right to be concerned: the FBI uses facial recognition without complying with privacy laws; roughly one out of every two Americans’ photos is in some kind of facial recognition technology (FRT) database; and the technology can reproduce race and gender bias, “misidentifying female and African American individuals at a higher rate.”

Jennifer Lynch, staff attorney at the Electronic Frontier Foundation, testified about all the ways that police can use—and misuse—facial recognition.

“Law enforcement officers can use mobile devices to capture face recognition-ready photographs of people they stop on the street; surveillance cameras boast real-time face scanning and identification capabilities; and the FBI has access to hundreds of millions of face recognition images of law-abiding Americans,” Lynch testified. “This has led to the development of unproven, inaccurate systems that will impinge on constitutional rights and disproportionately impact people of color.”

“This has real-world impact; an inaccurate system will implicate people for crimes they didn’t commit, forcing them to try to prove their innocence and shifting the traditional burden of proof away from the government,” Lynch testified. “Face recognition misidentifies African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively.”

Research suggests that several of the algorithms used in FRT searches are more likely to return an incorrect match when the suspect is black.

“If the suspect is African American rather than Caucasian, the system is more likely to erroneously fail to identify the right person, potentially causing innocent people to be bumped up the list—and possibly even investigated,” according to a statement by Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law.

“Perversely, due to disproportionately higher arrest rates among African Americans, face recognition may be least accurate for those it is most likely to affect: African Americans,” Bedoya said.
