British police are increasingly adopting artificial intelligence (AI), facial recognition and predictive policing technologies, often without promised public consultation, according to a new report from the London-based Royal Society of Arts (RSA).

The RSA, a 266-year-old organization dedicated to solving social problems (past members include Benjamin Franklin, Adam Smith, Charles Dickens, Karl Marx, Stephen Hawking and Tim Berners-Lee), published “A Force For Good?” late last month. The report emerged from the RSA’s Forum For Ethical AI, which convened a citizens’ jury to explore the use of artificial intelligence in decision-making.

UK police use of AI has been increasing for some time. South Wales Police have used facial recognition to catch and deter troublemakers at football matches between notoriously rowdy rivals Cardiff City and Swansea City. Durham Constabulary helped develop the Harm Assessment Risk Tool (HART), an AI system designed to predict the probability that a suspect will re-offend. In a scene straight out of a Philip K. Dick novel, at least 53 UK local councils and 45 of the country’s police forces are now using computer algorithms to assess the likelihood that a person will commit a crime. The “precrime” of yesterday’s science fiction has become today’s reality.
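
HART’s internals are not fully public, but published accounts describe it as a random-forest classifier trained on historical custody records. A minimal sketch of that kind of risk scorer, with invented feature names, synthetic data and made-up band cutoffs:

```python
# Sketch of a HART-style risk model. Published accounts describe HART as a
# random-forest classifier; everything below (features, data, thresholds)
# is illustrative, not the actual system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical predictors: age at arrest, prior offence count, years since
# last offence. The real HART reportedly draws on over 30 variables.
X = rng.integers(low=[18, 0, 0], high=[70, 20, 10], size=(1000, 3))
y = rng.integers(0, 2, size=1000)  # 1 = re-offended (synthetic label)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def risk_band(features):
    """Map a predicted probability to a low/moderate/high band, echoing
    HART's three-tier output. The cutoffs here are invented."""
    p = model.predict_proba([features])[0, 1]
    band = "low" if p < 0.33 else "moderate" if p < 0.66 else "high"
    return band, round(p, 2)

print(risk_band([25, 3, 1]))  # e.g. ('moderate', 0.52) on this fake data
```

Even in this toy form, the core concern is visible: the training data and the band cutoffs, not the algorithm itself, determine who gets flagged.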

According to the report, various UK police forces, including Durham Constabulary, Surrey Police, West Yorkshire Police and Kent Police, are all using crime mapping technology, including MapInfo, which it says “bears strong similarities to predictive policing systems.” The study found both operational and cultural concerns. Its authors lament that only a small minority of UK police forces would even confirm whether or not they were using AI and automated decision systems (ADS), including facial recognition alerts. A 2019 study by the Ada Lovelace Institute, an independent research organization, found that while 70 percent of the British public supports the use of such technology for criminal investigations, fully 55 percent of respondents want police use limited to specific circumstances.
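
MapInfo is a commercial GIS product, and the report does not detail how forces use it; but the “predictive policing” family of techniques it is compared to often starts with something as simple as binning incidents into grid cells and ranking the hottest cells. A toy sketch of that idea (not MapInfo’s actual method):

```python
# Toy grid-based crime hotspot map, illustrating the general technique
# behind predictive crime mapping. Coordinates and cell size are arbitrary.
import math
from collections import Counter

CELL = 0.01  # grid cell size in degrees (roughly 1 km)

def to_cell(lat: float, lon: float) -> tuple[int, int]:
    return (math.floor(lat / CELL), math.floor(lon / CELL))

# Hypothetical incident coordinates (lat, lon)
incidents = [
    (51.5074, -0.1278), (51.5080, -0.1270),
    (51.5076, -0.1281), (53.4808, -2.2426),
]

counts = Counter(to_cell(lat, lon) for lat, lon in incidents)

# The "predictive" step at its simplest: rank cells by past incidents and
# direct patrols to the hottest ones; past reports drive future policing,
# which is exactly the feedback loop critics worry about.
for cell, n in counts.most_common(3):
    print(f"cell {cell}: {n} incident(s)")
```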

The RSA researchers cite the potential for bias, “especially against historically marginalized groups in high-crime areas,” as an area of concern. They specifically note the “racial and gender biases” that have raised serious questions about the use of such technology in the United States. The authors also warned against “adopting new technologies without adequate cultural safeguards, especially around deliberation and transparency.”

The report noted that the research was conducted before the coronavirus pandemic, and that “the playing field for law enforcement has shifted dramatically since then.”

“The police have extensive new powers, deemed as necessary to limit one of the greatest public health crises in modern history,” the report states. “Policies may have been altered in this time, but we believe that adequate scrutiny, in particular of the use of new technologies, will be all the more necessary as we acclimatize to increasing police powers.”

The report’s authors caution against over-reliance upon AI. “One of our overriding concerns comes from the use of artificial intelligence as a means of increasing the efficiency of policing rather than the quality of it,” they wrote. “New technologies must be used responsibly, and for the purposes of improving police work rather than simply as a cost-cutting measure.”

“Artificial intelligence systems allow for cost-saving which decreases the availability of less measurable benefits of policing, such as relationship and community-building,” the report states. “There is also a danger that efficiency gains may be misleading or can produce unintended consequences.”

“New technologies will only be used effectively and responsibly by keeping a ‘human in the loop,’” the authors conclude. “AI must have real human oversight from beginning to end, providing continuous feedback and modification.”
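
What “human in the loop” can mean in practice is a system where the model only recommends, a named person decides, and both are logged for later audit and feedback. A minimal sketch of such a flow, with hypothetical names throughout:

```python
# Human-in-the-loop decision flow, as the report recommends: the model
# recommends, a named reviewer decides, and every decision is recorded so
# the system can be audited and modified. All names here are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Decision:
    case_id: str
    model_score: float
    model_recommendation: str
    reviewer: str
    final_decision: str
    rationale: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[Decision] = []

def decide(case_id: str, model_score: float, reviewer: str) -> Decision:
    # The model only recommends; the threshold here is invented.
    recommendation = "flag for review" if model_score >= 0.66 else "no action"
    # A named person makes the actual call and must give a reason.
    choice = input(f"{case_id}: model recommends '{recommendation}' "
                   f"(score {model_score:.2f}). Your decision? ")
    rationale = input("Rationale: ")
    record = Decision(case_id, model_score, recommendation,
                      reviewer, choice, rationale)
    audit_log.append(record)  # audit trail enables feedback and modification
    return record
```

The design point is that the override and its rationale are first-class data: oversight that leaves no record cannot provide the “continuous feedback” the report calls for.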

(Photo: Metropolitan Police/Twitter)
