Liz Kendall has expressed strong concern and disappointment to Ofcom over delays in implementing its online safety responsibilities. The Technology Secretary criticized the communications regulator's slow progress, emphasizing that families nationwide have waited too long for the protections outlined in the Online Safety Act (OSA).
Kendall highlighted in particular the online dissemination of antisemitic content, underscoring the government's prioritization of tackling antisemitism. A central point of contention is Ofcom's delay in enforcing new duties covering legal but harmful content, including hateful and abusive material targeting race, religion, sexual orientation, gender identity, and disability.
The new duties would require social media platforms to give adults the option to filter such content out of their feeds; the content is already prohibited for children. Notably, Ofcom's roadmap defers publication of the categorization register, and consultation on the additional duties, until around July 2026.
Although the OSA was enacted in October 2023, Ofcom only recently began using some of its new powers, drawing criticism for its prolonged consultations on updating its guidelines. In her letter, Kendall acknowledged the importance of a robust regulatory framework but expressed disappointment at the delays in implementing the additional duties on categorized services.
She stressed the need for urgency in completing the implementation of the outstanding duties so as to protect users, particularly women and girls, from harmful content and antisemitism. Kendall urged Ofcom to expedite its processes, especially regarding the user empowerment duties. The government, she said, remains committed to supporting Ofcom in prioritizing user safety and making full use of the Act's provisions.
In a dedicated section on combating antisemitism, Kendall reiterated the government's commitment to tackling the spread of antisemitic content. An Ofcom spokesperson, meanwhile, cited external factors affecting the categorization timeline, including a legal challenge against the government. Notwithstanding these challenges, the regulator said work is ongoing to enforce online platforms' legal obligations to protect users, especially children.
Ofcom's children's code of conduct, in effect since July, requires online platforms to implement stringent age verification measures, such as facial scans and ID checks, to prevent underage access to inappropriate content like pornography. Platforms must also act promptly against harmful content, including material relating to self-harm, suicide, eating disorders, and dangerous online trends.
In parliamentary discussions, MPs called for action against chatbots that promote self-harm and suicide among young people. Minister Narayan affirmed that the Act covers AI-based tools, but concerns were raised about the influence of chatbots on vulnerable individuals. The government has committed to closing any gaps in the legislation to ensure robust enforcement against harmful online practices.
The commitment to online safety and protection of vulnerable users remains a key focus for policymakers, as they navigate the evolving digital landscape to create a safer online environment for all.
