As UK ministers continue to quiz stakeholders over the effectiveness of the Online Safety Act, one charity chief raised concerns over the robustness of Ofcom’s enforcement of the controversial legislation.
Asked how well the communications regulator has enforced penalties against organizations that violate the OSA or fail to implement the required safeguards, Andy Burrows, CEO of the Molly Rose Foundation, said: “I do not get the impression that the companies are quaking in their boots at Ofcom’s enforcement approach.”
The Molly Rose Foundation was established in 2017 by the family of 14-year-old Molly Russell, who took her own life. Her father discovered she had viewed thousands of images online promoting suicide and self-harm.
Burrows went on to say that some of the enforcement action already taking place may be happening under supervision, but this is an opaque process, so it is difficult to say whether it will be “sufficiently industrious.”
Baroness Kidron, founder of the 5Rights Foundation, agreed with Burrows, adding that if Ofcom doesn’t truly test its regulatory powers, then it will not be able to provide the right evidence to ministers if it ever wants part of the OSA amended.
“I would absolutely say in defense of Ofcom that the act is wrong in certain places and does leave certain gaps, and will need some more work,” Kidron said.
“The frustration is that, actually, where it is clear and where it is mandated, we don’t want to see [Ofcom] stroking [platforms] and saying, you know, ‘come on guys, do it, do it, do it.’
“We want to see them taking action, being robust, and I’m very sympathetic to Ofcom in anywhere where they feel they need a power and it has not been provided by Parliament.”
Burrows’ comments followed a question from the Communications and Digital Committee about mandatory age assurance, and whether Ofcom has learned any lessons from the rollout that could be applied if the rules are extended to other organizations, such as VPN providers.
Baroness Kidron focused on the regulator’s need to ensure age assurance mechanisms are privacy-preserving and do not demand more data than is necessary to verify a user’s age.
“I think that what I would now like to see is a recommitment on privacy, and I would also like to see Ofcom use its powers to say that where it is not privacy-preserving, age assurance has not met the bar of being highly effective because it’s not highly effective in a cultural sense, even if it actually determines whether you are 18 or not.”
She said that ensuring these solutions protect privacy is key to building the public’s trust in them – trust that quickly waned after the new rules came into force in July.
Burrows cited the early days of the latest OSA measures, when people quickly found crude workarounds for age verification, including using video game avatars to pass as adults, as an example of why public trust has diminished.
“That would raise questions to me about whether that is a highly effective measure that is being deployed,” he said. “So, I would like to see Ofcom act quickly because public trust here is precious.
“Clearly, there do need to be privacy-preserving mechanisms, and we know that they exist. And I think enforcement is now the best way of being able to demonstrate to the public A) that this can be done, and B) that where we have seen high-profile examples which have generated public concern, that that is a reflection as to whether or not we have seen compliance rather than as it has been framed, whether or not this can be done.”
According to Lord Vaizey, speaking in the House of Lords on Monday, Ofcom has begun investigating 47 websites and apps suspected of non-compliance since it mandated highly effective age assurance (HEAA) systems for in-scope platforms.
Safe harbor
Another hotly debated topic was the safe harbor provision in the OSA – if in-scope platforms implement every measure Ofcom recommends, then they can rest assured they won’t be punished under the act, even if something goes wrong.
The idea is that it gives platforms certainty: even if one of Ofcom’s recommendations turns out to be flawed, whatever happens as a result of that flaw can’t come back to bite them until Ofcom changes its guidance.
However, that safety net may come at a cost, in that it discourages platforms from innovating and going beyond what’s required of them.
“The way that it’s worked in the Online Safety Act is that if you do Ofcom’s 44 measures, or whichever number it is now, then you’re fine,” said Kidron.
“Now you can choose to do something different, but if you do something different, you don’t have safe harbor, and that’s a problem because if what you could do is different, quicker, better, more advanced, more thoughtful, more nuanced, you could actually not have safe harbor for doing the better thing, and I think that’s why we’re talking about it as a negative incentive in this instance.”
The committee asked why the safe harbor provision is withdrawn when companies aim to do better. Rani Govender, policy and influencing manager at the NSPCC, said that larger platforms are likely a step ahead of others in identifying trends in online harms, because they have the power to collect more data than the regulator.
She pointed out that these platforms may hold information that could improve the act – by compelling all platforms to act against new harms – but there is little benefit to them in sharing it.
“Now, if they’re spotting new trends, new ways that harms are developing on their platform, but there isn’t anything in the codes of practice that addresses that, then there is no obligation on them to address those harms,” Govender said.
“So, we’re thinking about how do we stay on top of emerging harms. Well, there has to be something that forces companies, once they’ve identified them, to immediately take action and look at what they could do to mitigate them, and at the minute there is not that incentive there.”
Evolving online harms
The discussion also raised concerns about Ofcom’s ability to keep pace with the latest online threats and regulate accordingly.
Burrows acknowledged that Ofcom was doing a great job of understanding and articulating the risks related to matters such as child sexual abuse, and regulating these widely known issues.
However, he said there are newer harms that worry the Molly Rose Foundation “tremendously,” namely the threat posed by Com groups.
The National Crime Agency (NCA) issued an alert earlier this year warning of the dangers associated with Com groups, which are composed largely of teenage boys.
Reports of these groups increased sixfold between 2022 and 2024, and the agency said they cause harm via a broad spectrum of criminality.
Regular Reg readers will be familiar with Com networks and their involvement in cybercrime – from data breaches to fraud to ransomware. Com groups are sometimes made up of a new generation of English-speaking cybercriminals.
Also of special concern are the Com groups that share misogynistic content and sexual abuse material and, in Burrows’ experience, those that groom even younger people into carrying out acts of self-harm.
He told the committee: “We are seeing them commit a whole range of truly appalling harms, including, essentially, a new type of grooming focused on suicide and self-harm driven by sadistic behaviors.
“Now as someone who’s worked in this space for decades, I have to say this is probably one of the threat types that I find more disturbing and chilling than anything else that I have seen,” he added.
“There are law enforcement agencies who are queuing up to say that this is a huge concern that we are starting to see children, and particularly girls, being groomed for purposes of self-harm and suicide.
“The most appalling egregious acts of harm, stories of girls being coerced into self-harm acts relating to the groups and, you know, I’ve heard from parents here in the UK who are desperate to see the regulator take action, and some of those risks are not being recognized.”
An Ofcom spokesperson told The Register: “Online safety rules came into force recently and change is already happening. The majority of the top 100 most popular adult sites in the UK have now deployed an age check, accounting for nearly two thirds of daily visits to adult sites in the UK. We’ve also seen popular social media, dating, gaming, and messaging apps introduce age assurance to prevent children accessing harmful content.
“We’re holding platforms to account and launching swift enforcement action where we have concerns. We’ve already launched investigations into 69 sites and apps, and expect to announce more in the coming weeks and months.
“Technology and harms are constantly evolving, and we’re always looking at how we can make life safer online. We’ve already put forward proposals for more protections that we want to see tech firms roll out.”
Ofcom chief executive Melanie Dawes will field questions from the committee during a session next week, as part of a continuing effort to gather views on the regulator’s latest proposals. ®