London’s Metropolitan Police Service (MPS) has survived a legal challenge that attempted to curb its rollout of live facial recognition (LFR) technology across the capital.
The challenge was brought against the Met by civil liberties organization Big Brother Watch, which was representing Shaun Thompson, an anti-knife crime campaigner and youth worker who was falsely identified as a criminal suspect by LFR cameras in Croydon.
Big Brother Watch supported Thompson’s case, which argued that the technology violated his rights to privacy under articles 8, 10, and 11 of the European Convention on Human Rights (ECHR).
The UK’s High Court concluded this week that LFR technology itself does not violate any of the ECHR’s aforementioned articles, and found that Thompson’s personal rights to privacy were not infringed.
Presiding over the case, Lord Justice Holgate and Mrs Justice Farbey considered both UK domestic law under the Human Rights Act 1998, which incorporates the ECHR, and the Strasbourg court's case law, but found that the Met's LFR policy satisfied the requirement of being in accordance with and prescribed by the law.
In short, the justices found the Met’s planned use of LFR is legal and does not violate the human rights of Britons who are subjected to it.
Sir Mark Rowley, commissioner of the MPS, described the judgment [PDF] as “a significant and important victory for public safety.”
“The courts have confirmed our approach is lawful. The public supports its use. It works. And it helps us keep Londoners safe. The question is no longer whether we should use Live Facial Recognition – it’s why we would choose not to.
“Technology is advancing at record speed, and policing cannot afford to stand still – criminals won’t. Facial recognition is transformational for policing. Government and Parliament will want to carefully consider how they continue to enable, rather than over‑regulate, the use of technologies that help us prevent crime and protect the public as proven today.”
Silkie Carlo, director at Big Brother Watch, labeled the High Court’s judgment “disappointing.” As for Thompson, he plans to appeal the decision.
“I’ve considered the court’s judgment today and decided to appeal it to protect Londoners from facial recognition being used for mass surveillance and leading to situations like mine, where I was misidentified, detained, and threatened with arrest,” Thompson said.
“No one should be treated like a criminal due to a computer error.
“I was compliant with the police, but my bank cards and passport weren’t enough to convince the police the facial recognition tech was wrong. It’s like stop and search on steroids. It’s clear the more widely this is used, the more innocent people like me risk being criminalized.”
A hot topic
Police use of LFR in the UK is a fiercely debated topic. Law enforcement officials insist it is an invaluable tool to protect public safety, while privacy proponents argue it represents a severe surveillance overstep.
The Met, meanwhile, claims the tech has led to more than 2,100 arrests since 2024, nearly a quarter of which (24 percent) related to violent crimes against women and girls. It also claims more than 100 sex offenders were arrested off the back of LFR, and that the identifications potentially prevented many more sex attacks against vulnerable children.
In their unwavering support for the technology, police forces often cite the results of the independent testing to which LFR systems are subjected before they are deployed.
The National Physical Laboratory carries out these assessments, and as The Register previously reported, the Met likes to frame the results in positive ways.
However, despite the Met's claim that the technology performs consistently across demographic groups, false positive rates for Black people, Thompson among them, have been considerably higher than for any other group throughout various tests since at least 2020.
In the police’s most recent annual review, it claimed a low false positive rate of 0.0003 percent across a total of 3,147,436 faces scanned across all deployments. But measured against the number of alerts the LFR cameras actually raised (2,077), that rate rises to 0.48 percent. And 80 percent of those false positives were made on Black people.
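A quick back-of-the-envelope check shows how those two percentages describe the same handful of errors, just divided by different denominators. This is a sketch assuming the report's rounded figures; the underlying false-positive count is not stated, so it is derived here:

```python
# Figures from the Met's annual review (rounded as published).
faces_scanned = 3_147_436   # all faces scanned across deployments
alerts = 2_077              # alerts the LFR cameras actually raised

# 0.48 percent of alerts were false positives -> roughly 10 misidentifications.
false_positives = round(0.48 / 100 * alerts)
print(false_positives)                          # 10

# The same ~10 errors divided by every face scanned gives the headline rate.
print(f"{false_positives / faces_scanned:.4%}")  # 0.0003%
```

The "low" figure simply dilutes the same ten or so misidentifications across every passer-by whose face was scanned, rather than across the alerts officers actually acted on.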
“Overall, the system’s performance remains in line with expectations, and any demographic imbalances observed are not statistically significant,” the report stated. “This will remain under careful review.”
The UK government is approving wider use of LFR-equipped vans and permanent installations despite the flaws, which in some cases are significant enough that they are still preventing some police forces from rolling the technology out. ®