Amazon Pitched ICE to Buy Its Facial Recognition Surveillance Technology: Report

Although deportations of undocumented immigrants were plentiful during the Barack Obama years, U.S. Immigration and Customs Enforcement (ICE) has become even more aggressive under the Trump Administration—and one of ICE’s allies could be Amazon.

Over the summer, the e-commerce giant met with ICE officials to discuss Rekognition, Amazon’s facial recognition technology. And while selling Rekognition to a major government agency like ICE could be highly profitable for Amazon, critics of ICE’s Trump-era methods warn that the potential for abuse is high.

Alonzo Peña, ICE’s former deputy director, told the Project on Government Oversight (POGO) that if ICE uses Rekognition, possible overreach “should be an area of concern.”

Peña asserted, “If they have this technology, I can see it being used in any way they think will help them increase the numbers of detentions, apprehensions and removals.”

Peña also told POGO that under the Trump Administration, ICE’s methods have changed considerably from what they were under the Obama Administration. According to Peña, “In the past, certain areas like schools, churches and courts were off limits. There were policies in place that would prevent agents from going into those areas, but under this administration, a lot of those policies are no longer enforced.”

According to POGO, ICE agents visited Silicon Valley in June and met with Amazon representatives at the Redwood City, California offices of the consulting firm McKinsey & Company (which previously had a management contract with ICE). And on June 15, an Amazon Web Services (AWS) rep sent a thank-you e-mail to ICE’s Homeland Security Investigations (HSI) office.

The rep wrote, “Thanks again for your interest in AWS to support ICE and the HSI mission…. If there’s interest in further exploration, we can schedule a meeting to review the process in more depth and help assess your target list of challenges.”

When POGO asked ICE how many times they had met with Amazon, an ICE spokesperson said, in an e-mail, “We can’t provide data on how often we’ve met with a particular vendor to discuss emerging technology they’re developing, but industry outreach and building relationships with potential contractors is fairly standard within government acquisition.”

In May, the American Civil Liberties Union (ACLU) noted that some local police departments had been using Rekognition—including the Washington County Sheriff’s Office in Oregon. And the ACLU was critical, describing Rekognition as “a product that can be readily used to violate civil liberties and civil rights…. Amazon’s Rekognition raises profound civil liberties and civil rights concerns. Today, the ACLU and a coalition of civil rights organizations demanded that Amazon stop allowing governments to use Rekognition.”

Some Amazon employees have been wary of Rekognition as well. In an anonymous commentary, an Amazon employee complained, “We know from history that new and powerful surveillance tools left unchecked in the hands of the state have been used to target people who have done nothing wrong. In the United States, a lack of public accountability already results in outsized impacts and over-policing of communities of color, immigrants and people exercising their First Amendment rights. Ignoring these urgent concerns while deploying powerful technologies to government and law enforcement agencies is dangerous and irresponsible.”

By ACLU (Sponsored)

Imagine you’ve forgotten once again the difference between a gorilla and a chimpanzee, so you do a quick Google image search of “gorilla.” But instead of finding images of adorable animals, photos of a Black couple pop up.

Is this just a glitch in the algorithm? Or is Google—an ad company, not an information company—replicating the discrimination of the world it operates in? How can this discrimination be addressed, and who is accountable for it?

“These platforms are encoded with racism,” says UCLA professor and best-selling author of Algorithms of Oppression, Dr. Safiya Noble. “The logic is racist and sexist because it would allow for these kinds of false, misleading, kinds of results to come to the fore…There are unfortunately thousands of examples now of harm that comes from algorithmic discrimination.”

On At Liberty this week, Dr. Noble joined us to discuss what she calls “algorithmic oppression,” and what needs to be done to end this kind of bias and dismantle systemic racism in software, predictive analytics, search platforms, surveillance systems, and other technologies.
