On September 19, 2023, the UK’s Online Safety Bill (“OSB”) passed the final stages of Parliamentary debate, and will shortly become law. The OSB, which requires online service providers to moderate their services for illegal and harmful content, has been intensely debated since it was first announced in 2020, particularly around the types of online harms within scope and how tech companies should respond to them. The final version is lengthy and complex, and will likely be the subject of continued debate over compliance, enforcement, and whether it succeeds in making the internet safer, while also protecting freedom of expression and privacy.
What Services Are Covered?
As we have written previously, the OSB applies to “user-to-user” services (services through which users share and access content online, such as social media services and online message services), search services, and services that provide pornographic content. Certain services are exempted, including email, SMS and MMS services, and internal business services. To be within scope, the service must have “links” to the UK, meaning it has a significant number of UK users, the UK is a target market, or the service is capable of being used in the UK and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK arising from its use.
What’s Changed in the Final Version?
The OSB has evolved significantly since its initial introduction, and there have been a number of changes since our last update. These include:
One of the more controversial amendments to the OSB was last year’s move away from requiring service providers to remove content that is “legal but harmful”; the OSB as passed instead requires covered services to remove illegal content and content that is harmful to children. The OSB includes a wide range of content that is considered “harmful to children,” including pornographic content, content that promotes self-injury, eating disorders or behaviors associated with eating disorders, and bullying content.
Ofcom is required to carry out reviews of the incidence of such content on covered services, and publish reports on these reviews every three years.
The OSB as passed empowers Ofcom to issue technical notices requiring service providers to use “accredited technology” (either alone or in conjunction with human moderators) to identify terrorism content and child sexual abuse material (“CSAM”) on their services, whether communicated publicly or privately. Critics worry that Ofcom could use this power to require service providers to scan and review encrypted messages, and a number of service providers have spoken out against the change. Before issuing such a notice, Ofcom must commission a report from an independent expert (a “skilled person”) assessing how a notice to scan messages would impact privacy and freedom of expression in the given instance, to assist Ofcom in deciding whether to issue the notice.
Age Verification for Certain Sites:
The OSB imposes obligations on service providers to use age verification or age estimation techniques to prevent children from encountering certain harmful content, including pornographic content and content that promotes suicide, acts of self-injury, or eating disorders.
Providers of higher-risk services (such as social media and pornography sites) will also be required to implement identity verification for adult users in order to restrict the use of anonymous profiles.
The final version of the OSB introduces several new “communications offences”, targeting abusive messages intended to cause “non-trivial psychological or physical harm” and threatening messages, as well as new offences for sharing, or threatening to share, intimate images.
Ofcom, which is tasked with enforcing the OSB, plans to take a phased approach to implementation and enforcement. Once the OSB receives Royal Assent and becomes law, Ofcom will launch its first consultation process on the standards and codes of practice for service providers to follow in tackling illegal content. Ofcom will then turn to finalizing standards for ensuring child safety. Finally, Ofcom will work to categorize service providers and to ensure their compliance with any additional obligations, including producing transparency reports and preventing fraudulent advertising. (Certain categories of service providers—designated by Ofcom as Category 1, 2A or 2B services—will be subject to additional obligations due to their size, service functionality, and the likelihood of harm to end-users. The thresholds for categorizing higher-risk services will be set out in secondary legislation yet to be published.)
Service providers that fail to comply with the OSB and Ofcom’s standards may face investigation and potential fines of up to GBP 18 million or 10% of qualifying worldwide turnover, whichever is greater. Senior managers may also be held personally liable for failing to take all reasonable steps to prevent certain offences being committed by the service provider. In serious cases, Ofcom may seek a court order imposing “business disruption measures”, which could require relevant ISPs to block access to the non-compliant service.
Service providers should expect Ofcom to commence active supervision and enforcement of the OSB before the end of this year. Ofcom has stated that it will engage with higher-risk service providers to identify issues early, and will provide support to less well-resourced providers to assist with compliance.
Please reach out to a member of Covington’s UK-based Technology & Communications Regulatory team if you have any questions on the OSB.