While Apple has, temporarily at least, backed away from last year’s plan to run client-side scanning (CSS) software on customers’ iPhones to detect and report child sexual abuse material (CSAM) to authorities, European officials in May proposed rules to protect children that involve the same highly criticized approach.
The European Commission has suggested several ways to deal with child abuse imagery, including scanning private online communications and breaking encryption. It has done so undeterred by a paper penned last October by 14 prominent computer scientists and security experts warning that CSS creates serious security and privacy risks.
In response, a trio of academics aims to convey just how ineffective and rights-violating CSS would be to those who missed the memo the first time around. And the last time, and the time before that.
In an arXiv paper titled “YASM (Yet Another Surveillance Mechanism),” Kaspar Rosager Ludvigsen and Shishir Nagaraja, of the University of Strathclyde, and Angela Daly, of the Leverhulme Research Centre for Forensic Science and Dundee Law School, in Scotland, revisit CSS as a way to ferret out CSAM and conclude the technology is both ineffective and unjustified.
Client-side scanning in this context involves running software on people’s devices to identify unlawful images – generally those related to the exploitation of children, though EU lawmakers have also discussed using CSS to flag content related to terrorism and organized crime.
Apple’s approach involved using its NeuralHash machine-learning model to compute an identifier for images set to be synced to iCloud and compare it against a list of known CSAM identifiers. It didn’t fare all that well when security researchers found they could create hash collisions with non-CSAM images. European officials haven’t settled on a specific technical approach, but as far as the paper’s authors are concerned, CSS isn’t fit for the task.
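For readers unfamiliar with how that kind of matching works, here is a minimal, purely illustrative sketch in Python. A simple “average hash” stands in for NeuralHash (which is a proprietary neural model), and the blocklist entry and threshold are made-up values, not anything taken from Apple’s system.

```python
# Illustrative sketch of device-side perceptual-hash matching.
# Assumptions: a toy "average hash" instead of NeuralHash, a placeholder
# blocklist, and an arbitrary Hamming-distance threshold.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size greyscale, threshold each pixel on the mean
    brightness, and pack the bits into an integer fingerprint."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


# Hypothetical list of known-image hashes pushed to the device (placeholder value).
KNOWN_HASHES = {0x818181FF81818181}
MATCH_THRESHOLD = 4  # max differing bits still treated as a match


def scan(path: str) -> bool:
    """Return True if the image's fingerprint is 'close enough' to a listed one."""
    h = average_hash(path)
    return any(hamming(h, k) <= MATCH_THRESHOLD for k in KNOWN_HASHES)
```

The matching step reduces to comparing fuzzy fingerprints against a fixed list, which is why researchers could nudge unrelated images until they collided with listed hashes, and why small edits to a listed image can push it back outside the threshold.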
Ludvigsen, Nagaraja, and Daly argue that CSS can no more prevent the distribution of CSAM than antivirus scanning can prevent the distribution of malware.
Even if you assume, they argue, that a CSS system would catch all the CSAM it encountered – an unrealistic assumption – there’s no clear definition of what counts as CSAM. There’s a legal definition, they say, but it cannot be translated into rules a CSS system can enforce.
So adversaries will respond to CSAM scanning by finding ways to craft images that evade detection.
“CSS contains in its very notion constant surveillance upon the system, and unlike pure logging, attempts to oversee all events within a given framework,” the boffins explain. “This makes it very similar to software like antivirus, which we know cannot be ‘perfect’ as the definition of malicious software can never define all the types in existence.”
What’s more, the researchers claim the cost of trying to tackle CSAM this way far outweighs the benefits, and that’s likely to remain the case regardless of how the technology evolves. Presumably there would be some benefit to finding CSAM images loaded onto phones by child exploiters unaware that their devices now surveil for the state, but that benefit would be overshadowed by the constant violation of everyone else’s privacy rights and the loss of the protections encryption provides.
“Surveillance systems are well known to violate rights, but CSS present systems which will do this routinely or constantly, which is why we find them to be dangerous and cannot justify [them] by the goals they aim to serve,” the computer scientists argue.
They are, however, convinced that EU legislators will attempt to move forward with some sort of CSAM scanning scheme, so they’ve also attempted to explain the legal problems they expect will follow.
“We find that CSS systems will violate several rights within the European Convention of Human Rights, but our analysis is not exhaustive,” the researchers state in their paper. “They will likely violate the Right to a Fair Trial, in particular the Right to Remain Silent and Not Incriminate Oneself, Right to Privacy, and if implemented further than current examples, Freedom of Assembly and Association as well.”
For example, a trial cannot be fair, the researchers argue, if defendants cannot easily challenge evidence produced by an undisclosed algorithm. There’s always the possibility that the imagery may have been planted by authorities, fabricated, or downloaded as a result of entrapment.
The authors go on to chide the European Commission for the techno-solutionist belief that CSS is the only possible way to combat CSAM. The Commission, they say, “disregards and does not analyze the potential consequences either CSS or server-side scanning would have on cybersecurity and privacy, while they justify the victim’s potential positive outcomes outweighing the negative of everyone else.”
The researchers conclude that CSS is just too disruptive.
“If you want to dig for gold, you predict accurately where it is,” they say. “What you usually do not do, is to dig up the entire crust of the surface of the earth. CSS systems and mass surveillance represent the latter.” ®