The American prison system is so rife with inequality and the undue influence of private money that it can be easy to forget the priority of a corrections system: to actually adjust behavior and prevent people from coming back.
In that regard, the business of keeping people out of jail has a long way to go. Studies by the Bureau of Justice Statistics show that more than half of all inmates are arrested within a year of release; that number grows to 76.6% within five years.
Silicon Valley is one powerful group looking for a solution. Tech giants think they can get those numbers down, not years later or even on the day of release, but by predicting who will reoffend before they even leave.
In a barely noticed presentation uploaded to YouTube, Microsoft program manager Jeff King showed how the company's algorithms can examine inmate data and predict who will commit a crime within six months of release. Using a mock dataset built for training machine-learning models, Microsoft correctly predicted recidivism for 91% of inmates.
The system is incredibly simple to use. Into one end of the algorithm, via an Excel spreadsheet, you feed how many months a person was incarcerated, the inmate's gang affiliation, which rehab programs the inmate participated in, and the like. Out the other side comes a prediction of whether he or she will soon end up in prison again.
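Microsoft has not published its model, but the workflow King describes, tabular inmate features in and a reoffense prediction out, resembles a standard binary classifier. Here is a minimal illustrative sketch in Python; every feature name, weight, and data point below is invented for illustration, not drawn from any real corrections dataset:

```python
# Illustrative sketch only: a toy logistic-regression classifier
# mimicking the described workflow. All data is fabricated.
import math

# (months_served, gang_affiliated, completed_rehab) -> reoffended within 6 months?
TRAINING_DATA = [
    ((6, 1, 0), 1), ((48, 1, 0), 1), ((12, 0, 1), 0), ((36, 0, 1), 0),
    ((24, 1, 1), 0), ((3, 1, 0), 1), ((60, 0, 0), 1), ((18, 0, 1), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.1, epochs=2000):
    """Fit logistic-regression weights with plain stochastic gradient descent."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            feats = (x[0] / 60.0, x[1], x[2])  # scale months for stability
            p = sigmoid(sum(wi * fi for wi, fi in zip(w, feats)) + b)
            err = p - y
            w = [wi - lr * err * fi for wi, fi in zip(w, feats)]
            b -= lr * err
    return w, b

def predict(w, b, months, gang, rehab):
    """Return the modeled probability of reoffending within six months."""
    feats = (months / 60.0, gang, rehab)
    return sigmoid(sum(wi * fi for wi, fi in zip(w, feats)) + b)

w, b = train(TRAINING_DATA)
print(round(predict(w, b, months=6, gang=1, rehab=0), 2))   # high risk
print(round(predict(w, b, months=18, gang=0, rehab=1), 2))  # low risk
```

A production system would use a vetted library and far richer data, but the shape is the same: a profile of features goes in, and a risk score comes out, which is exactly the "little box" King describes below.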
"Once they were pushed through the corrections system and they went through through rehabilitation programs, we're then able to take a look at what rehab programs were most effective at either rehabilitating them, or having them reoffend almost immediately after release," King said during the presentation.
King is part of the small JPS team at Microsoft that helps police departments with their cloud storage and services. Ninety percent of the time, King is helping police figure out how to store all of the body-cam footage they've been collecting. But now, the JPS team is trying to help police put their data to good use. It's not likely the Microsoft algorithm will become its own program. Instead, King said, the machine learning could be plugged into existing software systems like Tribridge, which keeps track of and manages inmate data.
"Imagine if you could open a profile on an inmate, and there's a little box that gives you the likelihood that the inmate could reoffend," King told Mic. "We're trying to get police agencies to understanding what data can do for them. Most aren't even close."
The Chicago 400: Microsoft's program is just a proof of concept, but in Chicago, this kind of algorithm has already been put to work. The city built a "heat list" of roughly 400 people at high risk of repeat criminal offenses, then intervenes with a "custom notification" that points those people toward social services and ways to avoid reoffending. Chicago police told Mic that this kind of subject-based prediction, as opposed to traditional crime mapping, is one of the "most exciting things in police technology."
Even with such a light touch, the practice has been criticized as opaque and potentially prejudicial, largely because there are no uniform standards governing how police departments act on subject-based predictions.
Body-worn cameras face a similar challenge. Scorecards and surveys have shown that the vast majority of police departments lack robust rules for how body-worn cameras should be used, opening up huge gray areas where abuses of power can occur. But Chicago police defend the practice, reminding Mic that everyone on the list is at a high risk for a repeat offense, and that they have a responsibility to help people who are at risk.
"Most approaches are reactive: When someone calls 911, something's already happened," Deputy Chief Jonathan Lewin, chief technology officer of the Chicago Police Department, told Mic. "We know people on that list are at risk of being shot at or killed. So having that knowledge, I think it's incumbent to do something to try to save lives."
"The solution seems to be, in a big data world, we want to somehow sanctify the notion of the human volition — of human free will — and to preserve that as a central attribute."
More problems may arise after a person has been tagged as a potential risk. Kenneth Cukier, data editor for the Economist, told Big Think that labeling someone a potential criminal before they've done anything wrong risks stigmatizing them, or treating them as if they will certainly commit a crime, essentially robbing them of the idea that they have free will.
"The solution seems to be, in a big-data world, we want to somehow sanctify the notion of the human volition — of human free will — and to preserve that as a central attribute," Cukier said.
For now, the emerging industry of predicting inmate recidivism is in its early stages, being toyed with by companies, marketed to a handful of police departments and implemented from afar in one notable, politically troubled city.
And as long as the machines that decide who will and will not become a criminal are being used to keep people out of jail, as opposed to proactively searching, surveilling, policing or arresting them, departments are on safe ground.