As Durham police prepare to go live with an artificial intelligence (AI) system this morning, the move potentially heralds another milestone for modern policing: technology supporting officers through both the pre-arrest and post-arrest process.
The new system will classify suspects as low, medium or high risk of offending or reoffending. To date, it has been trained on five years' worth of offending-history data and has been trialled by Durham police.
The system, the Harm Assessment Risk Tool (Hart), was first tested in 2013, using data from Durham police records between 2008 and 2012.
>See also: The role of artificial intelligence in cyber security
The AI-based system appeared to work: forecasts that a suspect was low risk proved accurate 98% of the time, while forecasts that a suspect was high risk were accurate 88% of the time.
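The article does not describe how these figures were calculated; a natural reading is the share of forecasts for a given class that matched the observed outcome (the precision of that class). Purely as an illustrative sketch, with hypothetical labels and counts:

```python
def forecast_accuracy(predictions, outcomes, risk_class):
    """Share of forecasts for `risk_class` that matched the actual outcome
    (i.e. the precision of that class)."""
    forecasts = [(p, o) for p, o in zip(predictions, outcomes) if p == risk_class]
    if not forecasts:
        return None  # no forecasts of this class were made
    correct = sum(1 for p, o in forecasts if p == o)
    return correct / len(forecasts)

# Hypothetical predictions alongside observed outcomes
preds = ["low", "low", "low", "high", "high", "medium"]
actual = ["low", "low", "medium", "high", "medium", "medium"]

print(forecast_accuracy(preds, actual, "low"))  # 2 of the 3 "low" forecasts were correct
```

Note that this metric says nothing about how many genuinely high-risk suspects were forecast as low risk; that would require the complementary recall figure, which the article does not report.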
At the time, however, these results did not impact custody decisions, according to Sheena Urwin, head of criminal justice at Durham Constabulary.
The results reflect the system’s predisposition. It is designed to be more likely to classify someone as medium or high risk, in order to err on the side of caution.
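Hart's internals are not public in detail, but one standard way to build in this kind of caution (an illustrative assumption here, not Hart's actual method) is cost-sensitive classification: penalise under-predicting risk more heavily than over-predicting it, then choose the class with the lowest expected cost rather than the most probable one.

```python
# Illustrative cost-sensitive decision rule (not Hart's actual method).
# Under-prediction (e.g. calling a high-risk suspect "low") carries a
# larger penalty than over-prediction, so close calls skew upward.
CLASSES = ["low", "medium", "high"]

# COST[predicted][actual]: hypothetical penalties, asymmetric by design.
COST = {
    "low":    {"low": 0, "medium": 5, "high": 10},
    "medium": {"low": 1, "medium": 0, "high": 5},
    "high":   {"low": 4, "medium": 2, "high": 0},
}

def cautious_classify(probs):
    """Pick the class with the lowest expected misclassification cost.

    probs: dict mapping each class to the model's estimated probability.
    """
    def expected_cost(predicted):
        return sum(probs[actual] * COST[predicted][actual] for actual in CLASSES)
    return min(CLASSES, key=expected_cost)

# A borderline case: plain argmax would say "low"; the cautious rule says "medium".
print(cautious_classify({"low": 0.5, "medium": 0.3, "high": 0.2}))  # prints "medium"
```

The trade-off is exactly the one the article flags: deliberately inflating risk classifications reduces dangerous misses but increases the number of people labelled medium or high risk who would not in fact reoffend.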
This method of determination is contentious. One expert agreed that the tool could be useful, but warned that the risk of the system skewing decisions must be carefully monitored.
Inevitable
“I imagine in the next two to three months we’ll probably make it a live tool to support officers’ decision making,” Urwin told the BBC.
Urwin explained that suspects with no prior convictions would be less likely to be classified as high risk. However, this is partly determined by the nature of the alleged crime: if it were murder, for example, Hart would factor that into its output.
>See also: Policing cybercrime: a national threat
Paul Cant, vice president EMEA at BMC Software, said: “It should come as no surprise that the use of AI is starting to filter into public services – all businesses across all industries have to adapt to thrive in the digital era.”
However, there are bias concerns. “To some extent, what learning models do is bring out into the foreground hidden and tacit assumptions that have been made all along by human beings,” warned Professor Cary Coglianese, a political scientist at the University of Pennsylvania who has studied algorithmic decision-making.
“These are very tricky [machine learning] models to try and assess the degree to which they are truly discriminatory.”
Limitations
There is no doubt that Hart, and the technology behind it, has the potential to improve police operations. However, according to Urwin, the system has its current limitations.
>See also: Top 10 predictions for low-level cybercrime in 2017
At the moment, the system uses only information from Durham’s own records, not the national database. If someone from outside the county’s jurisdiction were arrested in Durham, their prior convictions would not be available to Hart.
“That’s a problem,” said Helen Ryan, head of law at the University of Winchester. She did add, however, that “Even without this system, [access to sufficient data is] a problem for the police.”
Moving forward, Ryan said Hart had the potential to benefit police forces following extensive trials. “I think it’s actually a very positive development,” she added. “I think, potentially, machines can be far more accurate – given the right data – than humans.”