“A major reset of the internet to make it much safer” is how Ofcom’s Gill Whitehead described the communications watchdog’s child safety announcements to me.
But can it really deliver that kind of sea-change in the protection of children online?
Turning faulty tech off and on again is a tried and trusted fix, but “resetting the net” is considerably more challenging.
First of all, consider the scale of the task: while the focus is on the largest and riskiest social media firms, over 150,000 services fall under the Online Safety Act, the new law Ofcom must enforce.
According to Ms Whitehead, the big tech firms are already taking action.
She pointed to measures by Facebook and Instagram owner Meta to combat grooming, and steps taken by streaming site Twitch, owned by Amazon, to stop underage users seeing “mature” content.
But the problem goes much wider than that.
Internet Matters, which provides advice on online safety, has just published research suggesting that one in seven teenagers aged 16 or under has experienced a form of image-based sexual abuse, with more than half saying that a young person known to them was to blame.
And it will be the second half of 2025 before the new rules come into force – child safety campaigners say that’s not fast enough, and the measures don’t go far enough.
Remember, too, that this is the announcement of a consultation – a process likely to involve an exchange between the regulator, tech firms, experts, parents and a range of tenacious activist groups.
Age checks
Among the 40 practical measures in the draft Children’s Safety Codes of Practice, some will be particularly controversial.
One contentious area is how tech firms check whether their users are children and, if they are, whether they are old enough to use the service.
The regulator calls this “age assurance”. It doesn’t specify exactly how this must be done – but it is clear that simply ticking a box or entering a date of birth won’t do.
It has previously suggested tech which scans a user’s face and uses artificial intelligence to estimate age could be acceptable, if used in conjunction with a demand for further proof of age.
But age checks could mean tens of millions of UK social media users, mostly adults, providing information to the tech firms – or to third-party age-check businesses.
Privacy campaigners have already pushed back on this point. Jim Killock, of digital rights organisation the Open Rights Group, wrote:
“Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites”.
But some of the third-party age-check firms disagree, with Yoti head Robin Tombs telling the BBC its systems can check ages of people “without sharing any of their identity details with the websites or apps they are trying to access”.
New systems also attempt to guard against obvious workarounds, such as using a photo of an older person, by checking for “liveness”.
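For the technically minded, here is a rough sketch of how such a flow might fit together. It is an illustration only, built on assumed thresholds – no provider publishes its implementation, and the function names below (check_liveness, estimate_age_from_face) are hypothetical stand-ins rather than any real product’s API:

```python
# Illustrative sketch only, not any vendor's real system. The thresholds
# are assumptions, and check_liveness() / estimate_age_from_face() are
# hypothetical stand-ins for a third-party provider's API.

from dataclasses import dataclass

MIN_AGE = 18       # assumed minimum age for "mature" content
ERROR_MARGIN = 5   # AI age estimates are approximate, so leave a buffer


def check_liveness(face_image: bytes) -> bool:
    """Placeholder: a real check looks for blinking or head movement to
    rule out workarounds such as holding up a photo of an older person."""
    return True  # canned value so the sketch runs end to end


def estimate_age_from_face(face_image: bytes) -> int:
    """Placeholder: a real system would run a trained model on the image."""
    return 21  # canned value so the sketch runs end to end


@dataclass
class Outcome:
    allowed: bool
    needs_further_proof: bool
    reason: str


def assure_age(face_image: bytes) -> Outcome:
    # Step 1: reject non-live inputs before estimating anything.
    if not check_liveness(face_image):
        return Outcome(False, False, "liveness check failed")

    estimated = estimate_age_from_face(face_image)

    # Step 2: only an estimate comfortably above the threshold passes
    # on the face scan alone.
    if estimated >= MIN_AGE + ERROR_MARGIN:
        return Outcome(True, False, "estimated well above the minimum age")

    # Step 3: anything borderline or below triggers a demand for further
    # proof of age, such as a document check, one reading of Ofcom's
    # suggestion that estimation be paired with further proof.
    return Outcome(False, True, "estimate too close to the threshold")


if __name__ == "__main__":
    print(assure_age(b"selfie-bytes"))
```

The buffer reflects a basic limitation of the approach: an AI estimate of, say, 21 cannot reliably separate an 18-year-old from a 16-year-old, which is why borderline cases escalate to further proof rather than passing on the estimate alone.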
But some argue age checks will be counter-productive.
“The worst thing is saying to a youngster ‘you can’t look at this’,” Surrey University professor Alan Woodward told the BBC.
“They’ll find ways around it, whether it’s using VPNs (virtual private networks) to go via routes where it doesn’t require that or where they can sign on with somebody else’s details.”
While he supports stopping children viewing some content, he worries some may respond by seeking out darker corners of the internet where age checks are not enforced.
And Ofcom’s own data suggests a significant minority of parents are willing collaborators, helping children under the minimum age to use social media sites.
An account opened by a parent or an older sibling and then handed over to a child, for example, will be harder to guard against.
Meta boss Mark Zuckerberg has previously argued he favours making app stores, such as those operated by Apple and Google, check ages instead – but these important gatekeepers aren’t covered by the consultation.
Ofcom told me it will consult on the role app stores play in protecting children, and that the government would have the power to introduce new duties on app stores if the resulting report suggested that was necessary.
But that won’t happen until 2026.