Predictive Policing with Big Data

Police departments nationwide have been using data and statistics to drive policing since the 1990s through an approach pioneered by the NYPD named CompStat, which was credited with dramatic reductions in crime and increases in efficiency. CompStat, a process and philosophy rather than a single technology or software, uses databases and GIS to record and track criminal and police activity and to identify areas that are lagging or need more attention. While it provides much more information than “primal policing”, CompStat has advanced little beyond simple spreadsheets and mapping software. Inspired by recent innovations in Big Data tools like Apache Hadoop, and by businesses like Walmart and Amazon using analytics to forecast future demand, departments across the country and around the world are looking to take this approach to the next level and go from tracking crime to predicting it.
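
In practice, the CompStat-style mapping described above amounts to binning incidents on a geographic grid and ranking the busiest cells. A minimal sketch of that idea in Python (the coordinates, offense labels, and cell size below are illustrative assumptions, not any department’s real data or system):

```python
from collections import Counter

# Hypothetical incident records: (latitude, longitude, offense type).
incidents = [
    (36.974, -122.030, "burglary"),
    (36.975, -122.031, "auto theft"),
    (36.951, -122.007, "burglary"),
]

CELL = 0.005  # grid cell size in degrees; an arbitrary choice for illustration


def cell_of(lat, lon):
    """Snap a coordinate to its grid cell so incidents can be binned."""
    return (round(lat / CELL), round(lon / CELL))


# Count incidents per cell and rank the busiest cells -- the "hot spots"
# a CompStat-style map would flag for extra attention.
counts = Counter(cell_of(lat, lon) for lat, lon, _ in incidents)
for cell, n in counts.most_common(3):
    print(f"cell {cell}: {n} incidents")
```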

The first department to adopt this strategy was Santa Cruz, through its city-wide, six-month Predictive Policing Experiment, named one of Time Magazine’s 50 best innovations of 2011. The large scope of the experiment, the city’s statistically average crime rate, and the challenges faced by the department make it a good example for other cities. Like most police departments right now, SCPD has a declining budget and a shrinking police force. On top of that, in the first six months of 2011 it saw an unprecedented crime wave. Driven to do more with less, the department signed on to work with researchers at UCLA to test a new method of modeling crime.

UCLA mathematician George Mohler noticed that, over time, crime maps resemble those of other natural phenomena, and he modified algorithms used to predict earthquake aftershocks to instead predict future property crimes from past data. Using seismologists’ models for crime isn’t as crazy as it sounds, since they have already been adopted in epidemiology and finance. Mohler’s approach is supported by two popular modern theories of crime: the rational choice model, which holds that criminals, like consumers, pick their targets rationally based on value, cost, effort, and risk, and the Broken Windows theory, popularized by the same NYPD Commissioner who implemented CompStat, Bill Bratton. Though the Broken Windows theory is typically applied to vandalism, its essence is that petty crime leads to major crime and that crime is self-reinforcing by setting norms and making areas seem poorly controlled. Past crimes can be predictive of future crimes because they indicate that an environment is target-rich, convenient for a criminal to access, vulnerable, or simply seems like a good place to strike due to a pattern of crime and poor control.
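
The aftershock models Mohler borrowed belong to the family of self-exciting point processes: a constant background rate plus a “triggering” term that raises the predicted rate near recent events. A rough sketch of that idea (the kernel shapes, parameter values, and data below are illustrative assumptions, not Mohler’s actual fitted model):

```python
import math

# Past crimes as (time_in_days, x_km, y_km) -- hypothetical data.
past_crimes = [(1.0, 0.2, 0.1), (3.5, 0.25, 0.12), (7.0, 2.0, 1.5)]

# Illustrative parameters: background rate, triggering strength,
# temporal decay (per day), and spatial spread (km).
MU, K, OMEGA, SIGMA = 0.05, 0.3, 0.1, 0.3


def intensity(t, x, y):
    """Estimated crime rate at time t and location (x, y): a constant
    background plus a contribution from every earlier crime that decays
    exponentially in time and as a Gaussian in space."""
    rate = MU
    for ti, xi, yi in past_crimes:
        if ti < t:
            dt = t - ti
            d2 = (x - xi) ** 2 + (y - yi) ** 2
            rate += (K * OMEGA * math.exp(-OMEGA * dt)
                     * math.exp(-d2 / (2 * SIGMA ** 2))
                     / (2 * math.pi * SIGMA ** 2))
    return rate


# Compare predicted rates for tomorrow's patrol planning.
print(intensity(8.0, 0.22, 0.11))  # near the recent cluster -> elevated
print(intensity(8.0, 5.0, 5.0))    # far from any past crime -> background
```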

Mohler’s algorithm is different from the CompStat approach, which simply identifies “hot spots” where crime is clustered. To predict the most likely type, location, and time of future crimes, Mohler must compare each past crime to every other, generating a massive amount of metadata. For the Santa Cruz Experiment, he went back to 2006, looking at roughly 5,000 crimes, which means on the order of 5,000 × 4,999 / 2, or about 12.5 million, pairwise comparisons. When he tested his method against traditional CompStat maps on the LAPD’s archives, he found that it predicted 20 to 95 percent more crimes.
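
The scale of that comparison step is easy to underestimate, since the number of unordered pairs grows quadratically with the number of crimes. A quick back-of-the-envelope check (only the 5,000 figure comes from the article; the rest is plain arithmetic):

```python
n = 5_000                  # crimes in the Santa Cruz data set going back to 2006
pairs = n * (n - 1) // 2   # unordered pairwise comparisons
print(pairs)               # 12,497,500 comparisons

# Each comparison yields space-time metadata (distance, time gap, offense
# types), so the derived data set dwarfs the original incident log.
```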

The experiment recently concluded, and the department believes that its predictive policing program was a success. Despite having fewer officers on the force, SCPD reversed the crime wave, lowering crime by 11% from the first half of the year to 4% below the historical averages for those months. Still, from that information alone it’s difficult to tell how effective Mohler’s strategy was; we will have a better indication when the LAPD concludes a similar but even larger study, one that includes a control group, in May.

Elsewhere, other departments in the United States and abroad are adopting a Big Data approach to policing as well. Sponsored by the Bureau of Justice Assistance, the Smart Policing Initiative is exploring predictive policing in over a dozen departments and agencies nationwide, including the Boston PD, the Las Vegas PD, and the Michigan State Police. Some efforts have already yielded results, such as in Richmond, where software combining business intelligence, predictive analysis, data mining, and GIS has contributed to a drastic drop in crime. Predictive policing is also being tried in the UK, where, in the single ward of the Greater Manchester area studied, burglary decreased by 26% versus 9% city-wide, prompting follow-up studies in Birmingham.

While predictive policing is showing promise and, in limited trials, results, the practice is still in its infancy, with plenty of room to grow. Much more metadata can be generated and many more factors incorporated into the predictive algorithms. For example, Santa Cruz could only predict property crime, as violent crime depends less on targets and opportunities and more on events and interpersonal interactions. In business and counter-terrorism, however, tools like social network analysis and social media monitoring have been used successfully to get a better feel for social dynamics. As predictive policing gets more attention and is adopted more widely, we can expect to see these and other Big Data solutions applied to law enforcement.
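
Social network analysis, for instance, usually means building a graph of who associates with whom and scoring each node’s centrality. A toy sketch using the networkx library (the people and edges are made up for illustration; a real deployment would draw on arrest, field-interview, or social-media data):

```python
import networkx as nx

# Hypothetical co-offending / association network.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
G = nx.Graph(edges)

# Degree centrality: who sits at the center of the network and might
# warrant attention in an investigation or intervention.
for person, score in sorted(nx.degree_centrality(G).items(),
                            key=lambda kv: -kv[1]):
    print(person, round(score, 2))
```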
