(Exclusive) Crime-prediction tool PredPol amplifies racially biased policing, study shows


Algorithms have taken hold of our lives whether we realize it or not. When Facebook delivers you clickbait and conspiracy theories, it's an algorithm deciding what you're interested in. When Uber ratchets up rush-hour prices, it's the service's algorithm kicking in to maximize profits. When ads for shoes you can't afford follow you around the internet until you give in, it's an algorithm tracking your course.

Algorithms are also taking over policing. In cities like Los Angeles, Atlanta and Philadelphia, "predictive policing" algorithms comb through past crime data to tell officers which people and places are most at risk for future crimes. The most popular is PredPol, an algorithm developed by the Los Angeles Police Department in collaboration with local universities that takes in hard data about where and when crimes happened and then makes a "hotspot" map of where crime will likely happen next.

"These models are supposed to give you some unseen insight into where crime is supposed to be, but it's just common-sense stuff."

But according to a study to be published later this month in the academic journal Significance, PredPol may merely be reinforcing bad police habits. When researchers from the Human Rights Data Analysis Group (HRDAG) — a nonprofit dedicated to using science to analyze human-rights violations around the world — applied the tool to crime data in Oakland, the algorithm recommended that police deploy officers to neighborhoods with mostly black residents. As it happens, police in Oakland were already sending officers into these areas.

"These models are supposed to give you some unseen insight into where crime is supposed to be," William Isaac, one of the report's co-authors, said in an interview. "But it's just common-sense stuff, and we make a case that these software suites are basically used as a tool to validate police decisions."

Using a publicly available version of PredPol's algorithm, Isaac and fellow researcher Kristian Lum fed 2010 reported crime data from Oakland into the model to predict where crimes would occur in 2011. To compare that map with what was actually happening in Oakland, the researchers used data from the Census and the National Crime Victimization Survey to create a heat map showing where drug use in the city was most prevalent in 2011.
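The algorithm Lum and Isaac pulled from PredPol's published work is, in that published form, a self-exciting point process fit to past incident reports. The sketch below is not that model; it's a deliberately simplified stand-in with an invented 20x20 grid, made-up incident data and hypothetical parameters (GRID, DECAY_DAYS, TOP_K), meant only to show the general shape of the exercise: reported incidents go in, a ranked list of "hotspot" cells comes out.

```python
# Deliberately simplified, illustrative stand-in for a hotspot-prediction step.
# This is NOT PredPol's model (its published form is a self-exciting point
# process); here each grid cell is just scored by an exponentially
# time-decayed count of past reported incidents, which is enough to show
# the "reported crime in, hotspots out" pipeline.
import numpy as np

rng = np.random.default_rng(0)

GRID = 20          # city modeled as a 20x20 grid of cells (hypothetical)
DECAY_DAYS = 30.0  # how quickly old reports stop mattering (hypothetical)
TOP_K = 15         # number of cells flagged for patrol each day (hypothetical)

# Fake "2010 reported crime": (day, row, col) triples, deliberately
# concentrated in a few cells to mimic neighborhoods that already
# receive heavy police attention.
hot_cells = [(3, 4), (3, 5), (12, 14)]
reports = []
for day in range(365):
    for _ in range(rng.poisson(8)):
        if rng.random() < 0.6:
            r, c = hot_cells[rng.integers(len(hot_cells))]
        else:
            r, c = rng.integers(GRID), rng.integers(GRID)
        reports.append((day, r, c))

def hotspot_scores(reports, today):
    """Score each cell by a time-decayed count of past reports."""
    scores = np.zeros((GRID, GRID))
    for day, r, c in reports:
        if day < today:
            scores[r, c] += np.exp(-(today - day) / DECAY_DAYS)
    return scores

scores = hotspot_scores(reports, today=365)
top = np.argsort(scores, axis=None)[::-1][:TOP_K]
hotspots = [divmod(int(i), GRID) for i in top]
print("Cells flagged for patrol:", hotspots)
```

The point is structural: a model like this never sees where crime occurs, only where crime is reported — which is exactly the gap the researchers' victimization-survey heat map was built to expose.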


In an ideal world, the maps would be similar. But in fact, PredPol directed police to black neighborhoods like West Oakland and International Boulevard instead of zeroing in on where drug crime actually occurred. Predominantly white neighborhoods like Rockridge and Piedmont got a pass, even though white people use illicit drugs at higher rates than minorities.

To see how actual police practices in Oakland matched up with PredPol's recommendations, researchers also compared PredPol's map to a map of where Oakland Police arrested people for drug crimes. The maps were strikingly similar. Regardless of where crime is happening, predominantly black neighborhoods have about 200 times more drug arrests than other Oakland neighborhoods. In other words, police in Oakland are already doing what PredPol's map suggested — over-policing black neighborhoods — rather than zeroing in on where drug crime is happening. 

"If you were to look at the data and where they're finding drug crime, it's not the same thing as where the drug crime actually is," Lum said in an interview. "Drug crime is everywhere, but police only find it where they're looking."

To be clear, Oakland does not currently use PredPol — researchers merely used Oakland as an example of what happens when you apply PredPol to a major metropolitan area. Dozens of other U.S. cities, however, do. It is a staple of policing in Los Angeles, which has the second-largest police department in the country after New York City's. Across the nation, PredPol is deciding which neighborhoods and city blocks officers prioritize when they make their rounds.


PredPol CEO Brian MacDonald reached out to clarify that PredPol isn't used for drug crime; instead, it looks at reports of crimes like assault, robbery and auto theft, precisely to avoid the kind of bias HRDAG discovered.

"The reason we do not predict for drug crimes is that these can be selectively enforced in different neighborhoods or by different officers," MacDonald said. "Our practice is to use the most objective data available, and 'drug crime' data does meet our criteria of inclusion."

Because PredPol's algorithm uses reported crime and arrests to generate a heat map — as opposed to where crime actually occurs — its recommendations can become a self-fulfilling prophecy. When officers are dispatched to neighborhoods where police already make a lot of arrests, they make even more, creating a feedback loop. 

In a second experiment, Isaac and Lum hypothesized that sending police to the neighborhoods chosen by the algorithm would lead to a 20% jump in reported crime there. The researchers fed that 20% increase back into the algorithm, which became orders of magnitude more confident that its predictions were correct.
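Continuing the toy sketch above, that feedback dynamic can be simulated directly. Only two ingredients come from the article: officers go to the flagged cells, and reported crime there is assumed to jump 20%. The function name run_feedback_loop, the round count and the "concentration" readout are all illustrative assumptions, not part of the study.

```python
# Illustrative feedback-loop simulation, reusing GRID, TOP_K, reports and
# hotspot_scores() from the sketch above. The "20% jump in reported crime in
# patrolled cells" is the article's hypothesis; everything else is made up.
def run_feedback_loop(reports, rounds=10, bump=0.20):
    reports = list(reports)
    day = 365
    for _ in range(rounds):
        scores = hotspot_scores(reports, today=day)
        top = np.argsort(scores, axis=None)[::-1][:TOP_K]
        flagged = {divmod(int(i), GRID) for i in top}
        # Police patrol the flagged cells and record ~20% more incidents
        # there; those new reports are fed straight back into the model.
        for r, c in flagged:
            extra = int(np.ceil(bump * scores[r, c]))
            reports.extend((day, r, c) for _ in range(extra))
        day += 1
        rows, cols = zip(*flagged)
        share = scores[rows, cols].sum() / scores.sum()
        print(f"round {day - 365}: {share:.0%} of predicted risk sits "
              f"in the {TOP_K} flagged cells")
    return reports

run_feedback_loop(reports)
```

Each round, the cells the model already favors accumulate extra reports, so their scores, and the model's apparent certainty about them, only grow; nothing in the loop corrects for the fact that the inputs reflect where police looked rather than where crime happened.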

"If police go there and find more crime, it creates a feedback loop, and the algorithm becomes more certain about these places that are over-policed," Lum said.

Predictive policing is still an exciting tool for departments under pressure from city hall to modernize the police force. But many departments are giving up on crime mapping entirely, for precisely the reason Isaac and Lum discovered in the course of their research: The new wave of predictive policing programs ends up telling police what they already know. One criminologist Mic spoke with last year referred to it as "old wine in new bottles."

Police in Richmond, California, decided not to renew their three-year PredPol contract after double-digit increases in crime left them unable to find evidence the software was working. In Burbank, police stopped relying on PredPol after a department-wide survey found that 75% of officers had "low or extremely low" morale, in part due to new predictive policing directives.

"Officers on the street were told that this predictive model will tell you where to go and affect our crime rates positively — just do it," Sergeant Claudio Losacco of the Burbank Police Department said in a phone interview. "It's like telling a fisherman of 20 years that we're going to tell you how to fish."

To evaluate the fairness and efficacy of predictive crime algorithms, outside parties would need to audit them. But most predictive policing technology exists in a black box of private-sector trade secrets: Systems that should be open to public scrutiny are outsourced to private companies like PredPol that don't have to disclose their algorithms for public audit. The only way the researchers were able to test the software in this case was to pull a version of the algorithm from one of PredPol's own published studies.

"If predictive policing means some individuals are going to have more police involvement in their life, there needs to be a minimum of transparency," Adam Schwartz, a senior staff attorney with the Electronic Frontier Foundation, said in an interview "Until they do that, the public should have no confidence that the inputs and algorithms are a sound basis to predict anything."


Schwartz pointed out that some states, such as Illinois, have legal prohibitions on adopting systems that have a racially disparate impact. Without the ability to evaluate predictive policing systems, and without strong laws in place to prevent police technology from amplifying the worst biases in police work, he says, predictive policing isn't ready for actual police use.

"What we want for police to do is not to be putting in place new systems of predictive policing until a lot more study is done and safeguards are put in place," Schwartz said. "Frequently these systems shouldn't be adopted at all."

As for Oakland, the city may not use PredPol yet, but Mayor Libby Schaaf has repeatedly sought over $150,000 to purchase PredPol for the Oakland Police Department. Malkia Cyril, executive director of the Oakland-based Center for Media Justice, says Oakland's legislators don't care that PredPol hasn't been proven effective.

"Predictive policing is clearly not a solution, and it'll transfer existing bias and existing iniquities in the current policing system into a predictive approach," Cyril said. "It's not technology that makes the place a city more efficient and a better place to live. For us, it'll make the city unlivable."
