A UK minister for policing has called for forces to double their use of algorithm-assisted facial recognition in a bid to snare more criminals.
Chris Philp MP, Minister of State for Crime, Policing and Fire, said the use of both live and retrospective facial recognition should increase following a commitment to spend £17.5 million ($21.3 million) on “a resilient and highly accurate system” to search all databases of images the police can access.
Retrospective facial recognition (RFR) uses an image captured at a crime scene – from CCTV, police cameras, or phone footage – which is compared against police databases to find a match. In live facial recognition (LFR), real-time footage from events is checked against a pre-defined target list of known criminals or suspects.
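To make the mechanics concrete, here is a minimal sketch of the matching step both techniques share: a probe image is reduced to a numeric embedding and compared against a database of embeddings, with any similarity score above a threshold reported as a candidate match. Everything in it is invented for illustration – random unit vectors stand in for a trained face-embedding model, and the function names and threshold are hypothetical, not any vendor's or force's actual system.

```python
# Illustrative sketch of the matching step common to RFR and LFR.
# A real system uses a trained face-embedding model; here, deterministic
# random unit vectors stand in for embeddings purely to show the mechanics.
import numpy as np

def embed(face_image_id: int, dim: int = 128) -> np.ndarray:
    """Stand-in for a face-embedding model: a unit vector per image ID."""
    rng = np.random.default_rng(face_image_id)  # deterministic for the demo
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

# "Database" of custody images (RFR) or a watch list (LFR): ID -> embedding.
database = {person_id: embed(person_id) for person_id in range(1000)}

def match(probe: np.ndarray, threshold: float = 0.6):
    """Return (person_id, score) pairs whose cosine similarity clears the threshold."""
    # Unit vectors, so the dot product is the cosine similarity.
    scores = {pid: float(probe @ emb) for pid, emb in database.items()}
    return [(pid, s) for pid, s in scores.items() if s >= threshold]

probe = embed(42)    # image from a crime scene (RFR) or a live camera frame (LFR)
print(match(probe))  # candidate matches above the threshold: [(42, 1.0)]
```

Lowering the threshold widens the net – more genuine matches, but also more false ones – which is why threshold choice matters in the bias findings discussed below.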
“Algorithms have advanced hugely in recent months and even blurred, or partially obscured images can now be successfully matched against custody images, leading to arrests,” Philp said in an open letter. “Searching the whole Police National Database (PND) image set rather than just local force ones will maximise the chance of a match, and I encourage routine use of RFR across the entire range of crimes.”
He claimed there were examples of RFR helping police to identify people suspected of murder, sex offences, domestic burglary, assault, car theft, and shoplifting, where identification “might otherwise have been impossible or taken much longer.”
Philp also said LFR would deter and detect crime in public settings that attract large crowds. College of Policing Authorised Professional Practice is in place and there is a “sound legal basis for LFR,” he claimed.
London’s Metropolitan Police “recently used LFR at an Arsenal v Tottenham game where it led to the arrest of three people: one charged with being in breach of a football banning order, one wanted on recall to prison for sexual offences, and one who admitted using threatening and abusive words and being in breach of a court order,” he said.
The National Physical Laboratory provided the “necessary assurance” about accuracy and the absence of gender or racial bias in the algorithms where LFR had been used, Philp said, while the immediate deletion of non-matched biometric data “addresses privacy concerns.”
However, speaking to Parliament’s Science, Innovation and Technology Committee in May, Dr Tony Mansfield, principal research scientist at the National Physical Laboratory, said the system used by the Met, the UK’s largest police force, was prone to bias against Black individuals on a set of test data created for his investigations.
“We find that if the system is run at low thresholds and easy thresholds, it does start showing a bias against the Black males and females combined,” he told MPs.
Mansfield added that he believed the Met did not operate the system at these thresholds.
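Mansfield's point about thresholds can be illustrated with a toy simulation. In the hedged sketch below, non-match (“impostor”) similarity scores for two groups are drawn from slightly different distributions – all numbers invented purely for demonstration, not taken from any real system's measurements – and the false match rate is computed at a strict and a low threshold. The gap between groups is negligible at the strict setting but widens sharply at the low one, mirroring the behaviour he described.

```python
# Toy demonstration of threshold-dependent bias. The distributions are
# invented for illustration and do not reflect the Met's system or the
# National Physical Laboratory's test data.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
group_a = rng.normal(0.30, 0.10, n)  # impostor similarity scores, group A
group_b = rng.normal(0.33, 0.10, n)  # group B: slightly higher on average

for threshold in (0.70, 0.50):       # strict vs. low ("easy") threshold
    fmr_a = (group_a >= threshold).mean()
    fmr_b = (group_b >= threshold).mean()
    print(f"threshold {threshold}: false match rate A={fmr_a:.5f}, B={fmr_b:.5f}")
```

At the strict threshold both groups see almost no false matches; at the low threshold, group B's false match rate is roughly double group A's – a disparity that only surfaces once the threshold is relaxed.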
In 2017, the Met was urged to cancel its planned use of facial recognition software at Notting Hill Carnival, Europe’s largest street festival. Privacy groups including Big Brother Watch, Liberty, and Privacy International called for a U-turn on the use of the tech.
In 2018, Big Brother Watch found that 91 percent of people flagged by the Met’s facial recognition system were not on the watch list. At the time, a Met spokesperson said the force did not consider these false positives “because additional checks and balances are in place to confirm identification following system alerts.” ®