Ofcom has published its first code of practice for tackling illegal online harms under the Online Safety Act (OSA), giving businesses a three-month window to prepare before enforcement begins in March 2025.
The code, published on December 16, 2024, outlines clear steps providers must take to manage illegal content. They need to assign a senior executive to oversee compliance, adequately fund content moderation teams, and enhance algorithms to prevent the spread of illegal material. They must also deactivate accounts linked to terrorist organizations.
The OSA affects over 100,000 online services, including search engines and platforms for user-generated content. It identifies 130 “priority offences,” such as child sexual abuse, terrorism, and fraud. Companies must take proactive measures against these issues through their content moderation frameworks.
From March 16, 2025, providers must assess the risks of illegal harms on their platforms and implement the prescribed safety measures. Ofcom has said it will take enforcement action against providers that fail to comply, with fines of up to 10% of a company’s global revenue or £18 million, whichever is greater.
Ofcom chief executive Melanie Dawes emphasized the shift toward accountability, noting that tech firms must prioritize user safety over profits. She warned that Ofcom will closely monitor compliance with these safety standards.
Technology secretary Peter Kyle echoed this urgency, calling the code a significant advance in online safety, requiring platforms to actively remove illegal content. He supports Ofcom’s enforcement actions, including potential fines or website shutdowns for non-compliance.
While the draft Statement of Strategic Priorities is still being finalized, it focuses on areas like safety design, transparency, and innovation in online safety technologies. Ofcom must report back to the government about its actions regarding these priorities and evaluate their effectiveness in creating safer online spaces.
Additionally, Ofcom plans to hold a consultation in spring 2025 to explore further code expansions, like banning accounts sharing child sexual abuse material and improving crisis response during emergencies.
Under Clause 122 of the OSA, Ofcom can require messaging services to implement client-side scanning, in which content is checked on the user’s device by comparing hashes of messages or attachments against databases of known illegal material. Providers are concerned that this could compromise user privacy and undermine end-to-end encryption.
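For illustration only, the sketch below shows the general idea behind hash-matching on the client: content is hashed on the device and compared against a list of hashes of known illegal material before the message is encrypted and sent. The hash values, function names, and exact-match approach are assumptions made for this example; deployed systems typically rely on perceptual hashing (so that resized or re-encoded copies still match) rather than the simple SHA-256 comparison shown here.

```python
import hashlib

# Hypothetical list of SHA-256 hashes of known illegal material.
# In practice, providers would receive curated hash lists from
# designated bodies; the value below is a placeholder.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_content(content: bytes) -> str:
    """Return the SHA-256 hex digest of an outgoing attachment."""
    return hashlib.sha256(content).hexdigest()

def is_known_illegal(content: bytes) -> bool:
    """Check an attachment against the hash list before it is sent.

    Exact matching only flags byte-identical copies; real client-side
    scanning systems generally use perceptual hashes to catch altered
    copies, which this sketch does not attempt.
    """
    return hash_content(content) in KNOWN_ILLEGAL_HASHES

# Example: scan an outgoing attachment on the sender's device,
# before end-to-end encryption is applied.
attachment = b"example file bytes"
if is_known_illegal(attachment):
    print("Match found: block upload and report per provider policy")
else:
    print("No match: send normally")
```

The privacy concern flows directly from this design: the check has to run on plaintext content before encryption, which is why providers argue it weakens the guarantees of end-to-end encrypted messaging.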
Mark Jones from Payne Hicks Beach pointed out that it took 14 months from the OSA receiving Royal Assent for the first code to be published, suggesting a lack of urgency. He noted that while Ofcom has enforcement powers, the key question remains whether they will be used effectively.
Xuyang Zhu from Taylor Wessing highlighted that companies face strict deadlines to comply, urging them to start their risk assessments and compliance measures now. She warned that the task won’t be easy, but timing is critical for avoiding penalties.
Ofcom earlier shared draft codes for online child safety, expecting platforms to implement strict age checks and content moderation practices to protect children. These standards include accountability measures for senior staff and regular reviews of risk management related to children’s safety.