FBI's Vast Facial Recognition Database More Likely to Misidentify Innocent Blacks as Suspects

In 2010, the FBI launched Next Generation Identification, a sprawling, complex program designed to use biometric tools like facial recognition, finger and palm prints, and iris scans in criminal investigations. At the time, privacy advocates worried that the FBI would collect and use the data without adequate oversight or privacy protections, especially given the rapid advances in facial recognition technology.

Last week, a hearing before the House Committee on Oversight and Government Reform confirmed that privacy experts were right to be concerned: the FBI uses facial recognition technology (FRT) without complying with privacy laws; roughly one out of every two American adults has a photo in some kind of FRT database; and the technology can reproduce race and gender bias, “misidentifying female and African American individuals at a higher rate.”

Jennifer Lynch, staff attorney at the Electronic Frontier Foundation, testified about all the ways that police can use—and misuse—facial recognition.

“Law enforcement officers can use mobile devices to capture face recognition-ready photographs of people they stop on the street; surveillance cameras boast real-time face scanning and identification capabilities; and the FBI has access to hundreds of millions of face recognition images of law-abiding Americans,” Lynch testified. “This has led to the development of unproven, inaccurate systems that will impinge on constitutional rights and disproportionately impact people of color.”

“This has real-world impact; an inaccurate system will implicate people for crimes they didn’t commit, forcing them to try to prove their innocence and shifting the traditional burden of proof away from the government,” Lynch testified. “Face recognition misidentifies African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively.”

Research suggests that several of the algorithms used in FRT searches are more likely to return the wrong result when the suspect is Black.

“If the suspect is African American rather than Caucasian, the system is more likely to erroneously fail to identify the right person, potentially causing innocent people to be bumped up the list—and possibly even investigated,” according to a statement by Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law.

“Perversely, due to disproportionately higher arrest rates among African Americans, face recognition may be least accurate for those it is most likely to affect: African Americans,” Bedoya said.
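To make the “bumped up the list” mechanism concrete, here is a minimal sketch of how a candidate-list search behaves. All names, scores, and the error model are hypothetical (real systems derive similarity scores from face embeddings, not hand-set numbers); the point is only that when an algorithm scores the true match poorly, innocent people rise toward the top of the list that investigators see.

```python
# Minimal sketch of a face recognition "candidate list" search.
# All names and scores are hypothetical illustrations.

def candidate_list(probe_scores, top_k=3):
    """Rank gallery entries by similarity to the probe image."""
    ranked = sorted(probe_scores.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_k]

# Well-calibrated case: the true match scores highest.
scores = {"true_match": 0.91, "innocent_A": 0.62, "innocent_B": 0.58}
print(candidate_list(scores))
# [('true_match', 0.91), ('innocent_A', 0.62), ('innocent_B', 0.58)]

# Degraded case: if the algorithm scores the true match poorly, as
# research suggests happens more often for African American faces,
# innocent people move up the list shown to investigators.
scores_degraded = {"true_match": 0.55, "innocent_A": 0.62, "innocent_B": 0.58}
print(candidate_list(scores_degraded))
# [('innocent_A', 0.62), ('innocent_B', 0.58), ('true_match', 0.55)]
```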
