7 Privacy Threats the Constitution Can't Protect You Against
The big issue, as EFF's Tien tells AlterNet, is that these surveillance tools muddy the legal barriers between public and private that are at the heart of constitutional protections, because they're so much more sophisticated than human observation. In other words, a person might expect that what they do on a street corner can be observed, but not that their actions might be logged in a database or that they can be tracked for a month.
As cameras become more sophisticated and better able to capture higher quality images from further away, there's a push to merge surveillance with biometrics technology -- the use of unique physiological features, like facial features or iris patterns, to ascertain identity.
After 9/11, many cities and airports rushed to augment their camera surveillance with facial recognition software. The tech proved disappointing, and after testing that hit a paltry 60 percent accuracy rate in one case (pretty bad if you're trying to establish identity), many programs were abandoned. In the years since, both private companies and university research labs funded with government grants have made vast improvements in facial recognition and iris scans, like 3-D face capture and "skinprint" technology (mapping of facial skin patterns). Iris scans can allegedly tell identical twins apart.
Many private companies shill these products directly to local law enforcement agencies, a business strategy that police tend to be pretty enthusiastic about. One such success story is the MORIS device, a gadget attached to an iPhone that can run facial recognition software, take digital fingerprints and capture an iris scan at a traffic stop. Since last fall, the MORIS device has been in use in police departments all over the country.
4. Government databases.
Privacy advocates point out that novel types of biometric technology like facial recognition and iris scans can be an unreliable form of ID in the field, but that has not discouraged government agencies from embarking on grand plans to hugely expand their biometric databases. The FBI's billion-dollar "Next Generation Identification" system (NGI) will house iris scans, palm prints, measures of voice and gait, records of tattoos and scars, and photos searchable with facial recognition technology when it's complete in 2014. The bulk of this information is expected to come from local law enforcement.
Other government agencies have also been jazzing up their biometric databases. The DoD's ABIS contains millions of biometric records from Iraq and Afghanistan, including images of faces, fingerprints, iris scans, and voice recordings. DHS's IDENT database houses photos searchable with facial recognition. The DoJ, DHS and DoD have a mandate to make their databases interoperable, so that anyone from one agency can search the others'.
The development of the new FBI database is subject to internal review to ensure compliance with privacy laws. But privacy advocates point out that there are few bulwarks against abuses in how the information is collected and used, especially since much of this information can be picked up remotely without consent.
5. FAST (Future Attribute Screening Technology).
Then there's the tech that's supposed to peer inside your head. In 2008, the Department of Homeland Security lab-tested a program called Future Attribute Screening Technology (FAST), designed to thwart criminal activity by predicting "mal-intent." Unsavory plans are supposed to reveal themselves through physiological tells like heart rate, pheromones, electrodermal activity, and respiratory measurements, according to a 2008 privacy impact assessment.
The 2008 privacy assessment, though, only addressed the initial laboratory testing of FAST's prophesying sensors on volunteers. According to a report in the journal Nature, sometime last year DHS also tested the technology in a large, undisclosed area in the northeastern US.