Thursday, April 3, 2025

Implementation of Online Safety Act Measures Begins

Ofcom is gearing up to enforce its Illegal Content codes under the Online Safety Act (OSA), following a three-month preparation period for firms. As of March 17, 2025, online service providers must comply with new safety measures or risk enforcement actions.

These measures include appointing a senior executive responsible for compliance, properly funding content moderation teams, improving algorithms to limit the spread of illegal content, and shutting down accounts linked to terrorist organizations. Companies also need to actively find and address child sexual exploitation and abuse (CSEA) material using advanced tools like automated hash-matching.
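Hash-matching works by fingerprinting known abuse imagery and checking every upload against that fingerprint list. The sketch below is a simplified illustration in Python: it uses exact SHA-256 matching and an invented database entry, whereas production systems (such as those built on the Internet Watch Foundation's hash lists or Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical database of hashes of known illegal files.
# The single entry below is the SHA-256 of b"test", used purely
# so the example is runnable; real lists are curated by bodies
# such as the IWF and are never exact-match SHA-256 in practice.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of raw file bytes as hex."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Flag an upload whose hash appears in the known-content list."""
    return sha256_hex(data) in KNOWN_HASHES

print(matches_known_content(b"test"))        # True  (in the list)
print(matches_known_content(b"other file"))  # False (not in the list)
```

The key property is that the platform only ever compares fingerprints, so the check scales to billions of uploads without human review of each one.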

Over the last three months, companies have been expected to carry out risk assessments on the harms present on their platforms. Now, they must clearly show Ofcom how they’re tackling illegal content and actively monitoring it. Peter Kyle, the technology secretary, highlighted that social media platforms now have a legal obligation to prevent and remove harmful content. In 2023 alone, the Internet Watch Foundation dealt with over 290,000 instances of child abuse content.

Kyle criticized tech firms for treating safety as an afterthought and emphasized that this is just the beginning. He promised decisive actions against emerging threats and affirmed that the OSA is a foundational step, not the end of the conversation about online safety.

The OSA affects more than 100,000 online services, including search engines and user-generated content platforms. It outlines 130 priority offenses that cover various harmful content types, including child sexual abuse, terrorist content, and fraud. Ofcom made it clear that it will take action against providers who fail to quickly address these risks. Non-compliance could lead to hefty fines—up to 10% of global revenue or £18 million, whichever is greater.

Looking ahead, Ofcom will hold a consultation in spring 2025 to discuss expanding the codes. This could involve stricter measures against accounts sharing child sexual abuse material and emergency response protocols for crises.

Under the OSA, Ofcom can require messaging providers to implement software for scanning phones for illegal material. With client-side scanning, hash values of messages are computed on the user's device, before encryption, and compared against a database of known illegal content. However, some encrypted communication providers warn that this level of surveillance might severely compromise user safety and privacy.
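The on-device flow can be sketched as follows. This is a hypothetical illustration, not any provider's implementation: the hash list is invented, and the check simply blocks a matching message before the (elided) encryption step runs.

```python
import hashlib

# Illustrative, invented hash list; real lists are supplied by
# child-safety bodies and use perceptual rather than exact hashes.
KNOWN_ILLEGAL_HASHES = {hashlib.sha256(b"known illegal file").hexdigest()}

def send_message(plaintext: bytes) -> str:
    """Client-side scan: hash the plaintext on-device before encryption."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in KNOWN_ILLEGAL_HASHES:
        return "blocked"  # matched the database: do not send
    # encrypt_and_transmit(plaintext) would run here; because the scan
    # happens before encryption, the end-to-end channel is unchanged.
    return "sent"

print(send_message(b"known illegal file"))  # blocked
print(send_message(b"hello"))               # sent
```

This ordering is exactly what critics object to: the scan sees plaintext that the encrypted channel was designed to keep private, so the guarantee of end-to-end encryption depends entirely on what the scanning component does with matches.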

Mark Jones, a legal partner, stated that the new codes shift the responsibility to firms to show they’re addressing illegal harms proactively. It’s a significant change from the previous reactive stance. Factors like the type of service and user base will affect the specific measures firms are expected to implement.

Iona Silverman from Freeths pointed out that, despite the three-month prep period, there is little evidence firms are taking compliance seriously. For example, Meta announced it would remove third-party fact-checking in favor of a community notes-style approach, which Mark Zuckerberg acknowledged could mean catching "less bad stuff."

Silverman supported the government’s stance that the OSA is about curbing criminality, not restricting free speech. She stressed that for the OSA to be effective, Ofcom must enforce the regulations rigorously, including imposing significant fines on platforms that do not prioritize user safety.