Why Predicting Crimes Is Almost Always Racially Biased

As police work grows more futuristic, it appears doomed to remain stuck in the past. Predictive policing might sound good in theory, but in practice it can be dangerous. The idea is that if police can figure out where crime is most likely to happen, they can prevent it or catch offenders in the act. The actual result, however, appears to be an oppressive reinforcement of age-old racial discrimination.

A report released in August by the RAND Corporation examined Chicago's predictive policing program, which launched in 2013. The program uses data on people with criminal records to identify those at the highest risk of being shot; once police have the names, they are supposed to help guide those people toward a safer lifestyle. The study found that officers who intervened were more likely to arrest a person on the list than to help them, that black neighborhoods were targeted the most, and that police use of tools like this raises serious constitutional questions.

A study soon to be published in the journal Significance examined how PredPol, one of the most popular predictive policing algorithms, would analyze the city of Oakland. According to Mic, the study found that when the algorithm processed crime data to decide where officers should be deployed, it specifically targeted black neighborhoods. Drug use is common in areas all around Oakland, the researchers note, but arrests for drug use are concentrated in certain areas, namely black neighborhoods.

The Burbank Police Department in California recently stopped using technology that tries to predict where crimes will occur from crime data, because it proved less effective than letting officers rely on their own knowledge of their beats. The system also produced absurd results, like directing officers to the police station because so many crimes are reported there.

Police departments across the nation are using these predictive policing tools to determine where to send their patrol cars, but the possibility that the technology is furthering racial discrimination has not been adequately addressed.

“It's not the algorithms so much, I think it's the data that's fed into them,” Ezekiel Edwards, director of the Criminal Law Reform Project at the ACLU, told AlterNet. “It's well known that criminal justice data, particularly crime data, is notoriously suspect.”

Information on where crime occurs depends on when and where crime is reported. But, “Since lots of crime goes unreported, it's very hard to make accurate predictions about future crime based on the limited amount of what we know,” Edwards said. Furthermore, which laws are enforced, how, and where can vary from one instance to the next.

We know that domestic abuse, theft and drug use occur in wealthy, often white neighborhoods, but police may not get involved in those areas. As a result, crime data reflects only the neighborhoods where police are already concentrating their enforcement.

“If you use data collected by police departments to predict risks of drug use, you are likely going to find that neighborhoods predominantly occupied by people of color or low socioeconomic folks are at risk for drugs,” Alethea Lange, a senior policy analyst for the Center for Democracy and Technology, told AlterNet. “However, we know that people abuse drugs at about the same rate, regardless of [socioeconomic status] or race.”
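
That dynamic is easy to see in a toy model. Below is a minimal sketch in plain Python — every number in it is hypothetical — of the feedback loop the critics describe: two neighborhoods with identical underlying drug use, patrols allocated according to past arrest counts, and only the offenses that happen in front of a patrol ending up as recorded arrests.

```python
# A minimal sketch (hypothetical numbers) of self-confirming arrest data:
# two neighborhoods with identical drug use, patrols allocated by past
# arrest counts, and only offenses seen by a patrol becoming arrests.
import random

random.seed(42)

USE_RATE = 0.05              # the same true rate of drug use in both places
RESIDENTS = 10_000           # residents per neighborhood
arrests = {"A": 10, "B": 5}  # historical data: A was policed twice as hard

for day in range(200):
    snapshot = dict(arrests)
    total = sum(snapshot.values())
    for hood in ("A", "B"):
        patrol_share = snapshot[hood] / total  # police go where past arrests were
        # Offenses occur at the same rate everywhere, but only those that
        # happen where police are watching become recorded arrests.
        offenses = sum(random.random() < USE_RATE for _ in range(RESIDENTS))
        observed = sum(random.random() < patrol_share for _ in range(offenses))
        arrests[hood] += observed

share = arrests["A"] / sum(arrests.values())
print(f"Neighborhood A's share of recorded arrests: {share:.0%}")  # stays near 67%
```

In this toy model, neighborhood A's share of recorded arrests never drifts back toward 50 percent, even though offending is identical everywhere: the data keeps confirming the deployment decisions that produced it.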

A white guy smoking a joint outside his house in Beverly Hills is much less likely to end up in cuffs than a black guy smoking a joint outside his apartment in Compton. Not only do the algorithms run on racially biased data; the companies that make predictive policing tools often refuse, for proprietary reasons, to disclose what data they use, which makes it hard to assess how biased the systems may be. Transparency is sorely needed.

“It would be very difficult to create a predictive policing algorithm that didn’t reflect the institutional racism of our society,” Lange said. “There are too many proxies for race to remove that data from the system and still end up with something useful.” Proxies for race could include someone’s zip code.
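
The zip code example is worth making concrete. Here is another minimal, purely synthetic sketch in Python — the zip codes, groups and percentages are all made up — showing that in a segregated city, a model that never sees race but does see zip codes can recover race almost perfectly.

```python
# A minimal sketch (entirely synthetic data) of zip code as a proxy for
# race: where housing is segregated, the zip nearly determines the race.
import random

random.seed(0)

def resident_race(zipcode):
    # Hypothetical segregation: each zip code is 90% one group.
    majority = "X" if zipcode == "90001" else "Y"
    minority = "Y" if majority == "X" else "X"
    return majority if random.random() < 0.9 else minority

people = [(z, resident_race(z))
          for z in ["90001"] * 5_000 + ["90210"] * 5_000]

# A race-blind "model" that only looks at zip codes still sorts by race.
guess = {"90001": "X", "90210": "Y"}
accuracy = sum(guess[z] == race for z, race in people) / len(people)
print(f"Race recovered from zip code alone: {accuracy:.0%}")  # about 90%
```

Any input that correlates this tightly with race smuggles race back into the system, which is why simply deleting a race column does not make an algorithm race-neutral.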

Predictive policing could also keep police from being held accountable for profiling, since officers can say they were simply following the algorithm's instructions. “It not only perpetuates bias, but I think it kind of sanitizes racial bias,” Edwards said.

In recent years, America has focused on excessive use of force, the militarization of police and racial bias. With such serious problems in policing, using a flawed algorithm to identify where a crime may occur can prime police to react aggressively to what turns out to be a minor offense. Given how minor stops can escalate, sometimes with tragic results (Eric Garner, Philando Castile and Walter Scott come to mind), perhaps it's sometimes best to let a crime go unpunished.

“Predictive policing is based on an unsupported premise—that crime can be accurately predicted,” Natasha Duarte, a fellow at the Center for Democracy and Technology, told AlterNet. “The research around crime theories and patterns is not robust and is controversial at best.” She said there is little evidence machine learning can predict crimes, and the constitutionality of using it is very questionable.

Apart from problems of reinforcing racially biased policing and the potential for overreaction, it's almost impossible to know if predictive policing tools are effective in the first place. “If they go to a place and a crime does occur, they can say it's because they predicted it, and if they go to a place and a crime doesn't occur, they can say it's because their presence thwarted it,” Edwards said.

It seems every time America tries to take a step forward, we find ourselves dragging the weight of institutional racism with us. Until laws are enforced fairly and those enforcing the laws are not biased, we will inevitably continue to fail at achieving justice. 
