Cops Are Already Trying to Use Computers to Predict Crime -- It Ain't Gonna Work
A small story popped up in the news this November -- "A unique collaboration between a University of California, Riverside sociologist and the Indio Police Department has produced a computer model that predicts, by census block group, where burglaries are likely to occur. ... The result is an 8 percent decline in thefts in the first nine months of 2013." The Indio police chief called the project the "wave of the future."
And by all appearances, it is on the menu for federal law enforcement. The National Security Agency and its digital dragnets like PRISM -- one of the big Snowden leaks -- aren't just about immediate surveillance of criminal activity. That's only a limited use of a technology that creates profiles of a population and records all their significant behavior, their communications and who their friends are. A recent report from the FBI's Behavioral Threat Assessment Center (BTAC) on pre-planned massacres like the one at Virginia Tech discusses the benefits of "assessments of the risk of future violence while enabling the development and implementation of dynamic behavioral strategies to disrupt planned attacks." In other words, stopping pre-crime.
But will it work?
Human data, like juridical definitions, are much too fluid to map and control. Never mind the huge number of failures by law enforcement agencies to follow obvious leads from their own intelligence on the Boston Marathon bombing suspects -- suggesting we might have a technocracy that's already too sprawling to succeed.
"The problem with using big data analysis to predict the future is that it bases its understanding of the future entirely on the past," explained Douglas Rushkoff, media theorist and author of Present Shock. "It uses the vast quantity of data about what has already happened to calculate the probabilities of those things happening again. It looks at what people have seen, thought, and done before to figure out how those trends might continue into the future. But it has no way of contending with novelty. People, as much as we'd like to forget this part, are actually alive. Being alive, being creatures with free will, means we engage in anomalous behaviors. We do weird things. We do not behave completely logically. While we approach many routine things—like buying tampons—in predictable ways, we are less and less predictable when it comes to bigger, stranger decisions."
Predicting the behavior of merely "potential" terrorists by skimming the global datascape will inevitably prove a ludicrous proposition. But it clears the launch pad for bonanza payouts to contractors and bureaucrats, as more and more technocrats are employed to parse more and more data leading nowhere.
"The choice to blow up a bridge, for example, is already really novel," said Rushkoff. "It is more apt to be relegated to the decision centers in our brains that aren't working in a routine manner. Which brings me to the other big problem with attempting to use Big Data to identify terrorists: The sample size is really small. A whole lot more people buy tampons in a day than crash planes into a building. So the sample size for consumers of products is much, much bigger than the sample size for terrorists. There just aren't enough terrorists for statisticians to apply factor analysis and other big data predictive modeling techniques."
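Rushkoff's sample-size point can be made concrete with a quick base-rate calculation. The numbers below are illustrative assumptions, not figures from the article or from any agency: even a hypothetical model that catches 99 percent of real plotters and wrongly flags only 1 percent of innocent people would bury its handful of true hits under millions of false alarms.

```python
# Illustrative base-rate arithmetic (all numbers are assumptions):
# when the event being predicted is extremely rare, even a highly
# accurate model produces overwhelmingly more false positives
# than true positives.

population = 300_000_000       # assumed population under surveillance
actual_plotters = 100          # assumed number of real plotters
detection_rate = 0.99          # model flags 99% of actual plotters
false_positive_rate = 0.01     # model wrongly flags 1% of innocents

flagged_plotters = actual_plotters * detection_rate
flagged_innocents = (population - actual_plotters) * false_positive_rate

# Precision: of everyone the model flags, what fraction is a real plotter?
precision = flagged_plotters / (flagged_plotters + flagged_innocents)

print(f"Total people flagged: {flagged_plotters + flagged_innocents:,.0f}")
print(f"Chance a flagged person is a real plotter: {precision:.5%}")
```

Under these assumed numbers, roughly three million people get flagged and the odds that any one of them is an actual plotter are a few thousandths of a percent -- the statistical version of Rushkoff's point that there simply aren't enough terrorists for the models to work.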
"The reason we put these people in power over us is to restrict ourselves," said Alan Moore, author of Watchmen and V for Vendetta, whose Guy Fawkes mask has been worn by Occupy, Anonymous and other populist deprogrammers. "We are frightened by our possibilities. We don't feel comfortable about being responsible for ourselves and for the societies in which we live. We would much rather delegate that to appointed figures who will inevitably abuse their positions, but we know that going in."