Ofcom says tech and social media platforms ‘must enforce’ their minimum age rules
"The industry has not done enough."
Ofcom has issued an urgent warning, calling on major sites and apps to enforce their minimum age rules with highly effective age checks.
As it examines continued failings by these services, the online safety regulator says it has this week written to the major sites and apps that young people use the most – including Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube – requiring them to prove to parents a ‘genuine commitment’ to protecting children online.
Since the UK’s online safety laws came into force last year, Ofcom says it has been investigating nearly a hundred different services.
The regulator has taken enforcement action, secured changes to disrupt the sharing of child sexual abuse material, and seen high-risk services either get in line or block access to the UK altogether, as well as ensuring that millions of daily visits to porn sites now require highly effective age checks.
Major platforms like X (formerly Twitter), Telegram, Discord, and Reddit have also introduced age controls to prevent children accessing adult or harmful content.
But still, there is more that needs to be done.
“While there are many examples of progress to be welcomed, the industry has not done enough,” Ofcom said in a damning statement.
It claims parents, carers, and guardians have ‘lost trust in tech firms’ ability to keep their children safe’, even as the Government consults on further legislative measures to address public concern.
Four ‘clear’ demands for further action have been set out by the regulator this week – effective minimum-age policies and enforcement of these, strict child grooming protections, safer feeds and algorithms for children, and an end to product testing – particularly AI tools – on children.

Ofcom says it has given the aforementioned platforms a deadline of 30 April to report back to it on the action they will take, and then the following month, the regulator will report on how the companies have responded and announce any next steps for regulatory action.
Speaking on the warning issued this week, Dame Melanie Dawes, Ofcom’s Chief Executive, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products.
“There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid. That must now change quickly, or Ofcom will act.”
Featured Image – Julian Christ (via Unsplash)