“If they don’t start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous,” she said.
“I’m asking the industry now to get moving, and if they don’t they will be hearing from us with enforcement action from March.”
But critics say the OSA fails to tackle a wide range of harms for children.
Andy Burrows, head of the Molly Rose Foundation, said the organisation was “astonished and disappointed” by a lack of specific, targeted measures for platforms on dealing with suicide and self-harm material in the guidance.
“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life,” he said.
Under Ofcom’s codes, platforms will need to identify if, where and how illegal content might appear on their services and set out how they will stop it reaching users.
According to the OSA, this includes child sexual abuse material (CSAM), controlling or coercive behaviour, extreme sexual violence, and content promoting or facilitating suicide and self-harm.
Ofcom began consulting on its illegal content codes and guidance in November 2023.
It says it has now “strengthened” its guidance for tech firms in several areas.
This includes clarifying requirements to remove intimate image abuse content, and helping guide firms on how to identify and remove material related to women being coerced into sex work.