On 26 October 2023, the UK’s Online Safety Bill received Royal Assent, becoming the Online Safety Act (“OSA”). The OSA imposes various obligations on tech companies to prevent the upload of illegal user content—such as terrorist content, revenge pornography, and child sexual exploitation material—to their services, to remove such content rapidly, and to take steps to reduce the risk that users will encounter it (please see our previous blog post on the Online Safety Bill).
The OSA applies to both user-to-user (“U2U”) services and search services. A U2U service is one through which users can share content online, such as social media and online messaging services. A search service is an internet service that is, or includes, a search engine, such as Google.
If in-scope firms do not comply with the provisions of the OSA, Ofcom has the power to issue a fine of up to £18 million or 10% of global annual revenue, whichever is greater.
The majority of the OSA’s provisions will come into force in two months’ time. However, certain provisions of the Act came into force yesterday, 26 October 2023. These provisions establish Ofcom, the UK communications regulator, as the online safety regulator responsible for enforcing the OSA.
Ofcom will publish its first consultation on illegal harms on 9 November 2023. The consultation will contain proposals for how services can comply with the OSA’s illegal content safety duties, together with draft codes of practice. Thereafter, Ofcom has announced that it will take a phased approach to publishing guidance and codes of practice, prioritising the most serious categories of harm under the OSA, including child safety, pornography, and the protection of women and girls.