Tech companies have been warned to protect young people online after MPs voted down a blanket social media ban for under-16s.
The Information Commissioner’s Office (ICO) and Ofcom, the communications regulator, have written to several platforms to demand stronger protections for children.
Ofcom has given Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube until the end of April to explain what action they're taking on age checks and on preventing online grooming.
The platforms must also set out how they’re tackling harmful algorithms, and how they roll out updates for users, with Ofcom demanding an “end to product testing on children”.
Similarly, the ICO has written to TikTok, Snapchat, Facebook, Instagram, YouTube, and X – formerly Twitter – asking them how their age check policies keep children safe.
It comes after a Conservative-led push to ban under-16s from social media failed in the House of Commons, where it was voted down by 307 votes to 173.
After initially opposing the measure, ministers are now consulting on a ban, without committing to backing it.
Australia became the first country to implement a social media ban for children when its policy took effect in December last year.
Ofcom said its research had shown that minimum age policies of 13 were not being properly enforced, with 72% of children aged eight to 12 accessing sites and apps prohibited for their age.
Children ‘routinely exposed to risks’
Its chief executive Dame Melanie Dawes accused big tech companies of “failing to put children’s safety at the heart of their products”.
She continued: “There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid.
“That must now change quickly, or Ofcom will act.”
ICO chief executive Paul Arnold said: “With ever-growing public concern, the status quo is not working and industry must do more to protect children.”
‘No excuse’ for tech firms not to act
Ofcom said it will publicly report in May on how platforms have responded, when it will also publish new research on the impact the Online Safety Act has had on children’s online experiences in its first year.
The regulator said it “will be ready to take enforcement action” if it is not satisfied with the firms’ responses, which could include strengthening its rules.
The ICO, meanwhile, said it had contacted some of the “highest risk services” and warned that “further regulatory action” could follow if they fail to act.
Mr Arnold said: “Our message to platforms is simple: act today to keep children safe online.
“There’s now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place.”
The regulators’ push was welcomed by the Molly Rose Foundation, which was set up in memory of Molly Russell, a 14-year-old who took her own life after viewing harmful content on social media.
The charity said Ofcom was “turning up the heat on reckless tech firms and their dangerous products which continue to cause daily harm to children”.
Tech firms respond
In a statement, a YouTube spokesperson said the platform had been building products specifically for children and teenagers for more than 10 years, which were “designed to provide age-appropriate high-quality experiences”.
They continued: “We are surprised to see Ofcom move away from a risk-based approach, particularly given that we routinely update them and other regulators on our industry-leading work on youth safety.”
Meta, which operates Facebook and Instagram, said it had already implemented “many” of the solutions called for by regulators, including AI that detects users’ ages based on their activity, and facial age recognition technology.
It added: “We also place teens in Teen Accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on our apps.”
A spokesperson for Roblox said the platform was in “regular dialogue” with Ofcom about protecting players, and had launched more than 140 safety features in the past year, including mandatory age checks for access to chat features.
“While no system is ever perfect, we continue to strengthen protections designed to keep players safe and look forward to demonstrating our efforts in our ongoing dialogue with Ofcom,” they said.
The other platforms named have been contacted for comment.