The New York City crime rate famously plummeted in the mid-1990s under the watch of police commissioner William Bratton, who introduced a computerized mapping system called CompStat to help cops track crime hot spots. He later took the system to Los Angeles, where once again crime plunged. CompStat is now used nationwide, reports Miller-McCune (July-Aug. 2011), and Bratton has become a law-enforcement superhero.
Bratton’s latest innovation, called predictive policing, represents the next step in computerized law enforcement: moving beyond charting past felonies to forecasting future offenses. “The only way for us to continue to have crime reduction is to start anticipating where crime is going to occur,” says L.A. police lieutenant Sean Malinowski, with whom Bratton conceived the sophisticated data analysis program. The two unveiled their plan three years ago, and the U.S. Department of Justice funded their continued research with a $200,000 grant. Not all policy makers, or even cops, share the enthusiasm for predictive policing, however. “Hardheaded street cops are understandably skeptical,” says Miller-McCune. “And civil libertarians are concerned that it could result in extra police pressure on poor and minority neighborhoods.”
Regardless, predictive policing has moved on to the testing stage, at least on a small scale. Minneapolis stationed police near a street where analysts predicted storefront crime, and officers caught felons shortly after an armed robbery there. Arlington, Texas, used predictive policing to decide to place officers in neighborhoods with deteriorating houses, which also have a higher rate of home burglaries. If the analyses seem simplistic, Miller-McCune notes that “cross-pollinating data can yield unexpected results [and] counterintuitive conclusions,” such as the observation that, because muggers can’t work in the dark, more street lights actually boost rather than reduce crime.
This article first appeared in the November-December 2011 issue of Utne Reader.