Cops Are Already Trying to Use Computers to Predict Crime -- It Ain't Gonna Work
A small story popped up in the news this November -- "A unique collaboration between a University of California, Riverside sociologist and the Indio Police Department has produced a computer model that predicts, by census block group, where burglaries are likely to occur. ... The result is an 8 percent decline in thefts in the first nine months of 2013." The Indio police chief called the project the "wave of the future."
By all appearances, it is on the menu for federal law enforcement. The National Security Agency and its digital dragnets like PRISM -- one of the big Snowden leaks -- aren't just about immediate surveillance of criminal activity. That's only a limited use of a technology that builds profiles of a population and records all of its significant behavior, communications and associations. A recent report from the FBI's Behavioral Threat Assessment Center (BTAC) on pre-planned massacres like the one at Virginia Tech touts "assessments of the risk of future violence while enabling the development and implementation of dynamic behavioral strategies to disrupt planned attacks." In other words, stopping pre-crime.
But will it work?
Human data, like juridical definitions, are far too fluid to map and control. Never mind the long list of failures by law enforcement agencies to act on obvious leads from their own intelligence about the Boston Marathon bombing suspects -- failures suggesting we may already have a technocracy too sprawling to succeed.
"The problem with using big data analysis to predict the future is that it bases its understanding of the future entirely on the past," explained Douglas Rushkoff, media theorist and author of Present Shock. "It uses the vast quantity of data about what has already happened to calculate the probabilities of those things happening again. It looks at what people have seen, thought, and done before to figure out how those trends might continue into the future. But it has no way of contending with novelty. People, as much as we'd like to forget this part, are actually alive. Being alive, being creatures with free will, means we engage in anomalous behaviors. We do weird things. We do not behave completely logically. While we approach many routine things—like buying tampons—in predictable ways, we are less and less predictable when it comes to bigger, stranger decisions."
Predicting the behavior of merely "potential" terrorists by skimming the global datascape will inevitably prove a ludicrous proposition. But it clears the launch pad for bonanza payouts to contractors and bureaucrats, as more and more technocrats are employed to parse more and more data leading nowhere.
"The choice to blow up a bridge, for example, is already really novel," said Rushkoff. "It is more apt to be relegated to the decision centers in our brains that aren't working in a routine manner. Which brings me to the other big problem with attempting to use Big Data to identify terrorists: The sample size is really small. A whole lot more people buy tampons in a day than crash planes into a building. So the sample size for consumers of products is much much bigger than the sample size for terrorists. There just aren't enough terrorists for statisticians to apply factor analysis and other big data predictive modeling techniques."
"The reason we put these people in power over us is to restrict ourselves," said Alan Moore, author of Watchmen and V For Vendetta, whose Guy Fawkes mask has been worn by Occupy, Anonymous and other populist deprogrammers. "We are frightened by our possibilities. We don't feel comfortable about being responsible for ourselves and for the societies in which we live. We would much rather delegate that to appointed figures who will inevitably abuse their positions, but we know that going in."
"If as you say there is that realization dawning throughout culture at the moment, then that can only be a good thing," Moore added. "The closer we get to reality, the better off we generally are."
"There's something quaint about these attempts to control us today, in a period where we are boiling with information and complexity, both of which have reached levels that could be called fractal, if that wasn't a polite way of saying chaotic," Moore told me. "I really don't think that the old tyrannies work anymore. That's a very liberating thing, although it is just a marker of how complex and alien our situation is rapidly becoming. We are reaching a boiling point, and what happens after that is unpredictable."
We have to face the reality that We The People won't be able to counter the surveillance state politically until we see evidence of ourselves within it -- not just leaks about capabilities, but leaks showing the government knows things about us that we don't want known. "It's true we're devoting more brain to active RAM than hard drive, if I might use that metaphor," said Rushkoff. "We relegate memory to the web, and thus we forget. When 9/11 happened, we seemed to be surprised to learn that someone tried to blow up the United Nations just a few years earlier. Or the Holland Tunnel. The real problem is that we're so distracted by the onslaught of communications and data that we can't think. People are too busy responding to consider the implications."
Rushkoff, who has been writing about big data surveillance for years, said, "Everyone who works in the industry knows full well what's been going on. It is remarkable that this particular episode of documentation has become such a big deal. We are at the mercy of a system that is desperately attempting to maintain itself. So the way to keep humans from getting the plot is to disorient us. It's not Cheney doing it; it's the machine itself. That's what made Snowden a hero. He stood up as a human against the machine."