On November 8, 2024, the UK’s communications regulator, the Office of Communications (“Ofcom”), published an open letter to online service providers operating in the UK regarding the Online Safety Act (“OSA”) and generative AI (the “Open Letter”).  In the Open Letter, Ofcom reminds online service providers that generative AI tools, such as chatbots and search assistants, may fall within the scope of regulated services under the OSA.  More recently, Ofcom also published several pieces of guidance (some of which are under consultation) that include further commentary on how the OSA applies to generative AI services.

Application of the OSA to generative AI chatbot tools and platforms

The Open Letter gives the following examples of when and how Ofcom considers the OSA will apply to generative AI and chatbots:

  • Generative AI chatbots that enable users to share text, images or videos generated by the chatbot with other users will be deemed “user-to-user services” subject to the OSA.  This includes functions that enable multiple users to interact with a chatbot at the same time.
  • If a platform enables a user to generate their own generative AI chatbot (for example, a bot that mimics the persona of a real or fictional person) and makes it available for others to use and interact with, the platform will be regulated as a “user-to-user service” under the OSA, and any text, images or videos created by this user-generated chatbot will be treated as user-generated content.
  • Generative AI tools that search multiple sites or databases will be deemed “search services” regulated by the OSA.  This includes generative AI tools that (i) modify, augment or facilitate the delivery of search results on an existing search engine or (ii) provide ‘live’ internet results on a standalone platform.
  • Sites and apps that include generative AI tools capable of generating pornographic material could be considered services providing pornographic content.  Such services could therefore be subject to the obligations under Part 5 of the OSA, including the requirement to implement highly effective age assurance.  Providers could implement certain safeguards, such as keyword blockers that prevent the generation of pornographic content, to exclude their services from the scope of Part 5 (see section 3.21 of Ofcom’s Guidance on highly effective age assurance).

Next Steps

The OSA obligations have started to apply in phases throughout 2025.  Below is a timeline of the upcoming deadlines for compliance with certain OSA obligations:

  • Providers of pornographic content must implement highly effective age assurance measures by January 17, 2025.
  • Providers of user-to-user services and search services must complete (i) an illegal content risk assessment by March 16, 2025, and (ii) a children’s access assessment by April 16, 2025.
  • Providers of services likely to be accessed by children must complete children’s risk assessments by July 2025.

The Covington team continues to closely monitor developments in online safety, including the implementation of the Online Safety Act.  Please reach out to a member of the team if you have any questions.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Sam Jungyun Choi

Recognized by Law.com International as a Rising Star (2023), Sam Jungyun Choi is an associate in the technology regulatory group in Brussels. She advises leading multinationals on European and UK data protection law and new regulations and policy relating to innovative technologies, such as AI, digital health, and autonomous vehicles.

Sam is an expert on the EU General Data Protection Regulation (GDPR) and the UK Data Protection Act, having advised on these laws since they started to apply. In recent years, her work has evolved to include advising companies on new data and digital laws in the EU, including the AI Act, Data Act and the Digital Services Act.

Sam’s practice includes advising leading companies in the technology, life sciences and gaming sectors on regulatory, compliance and policy issues relating to privacy and data protection, digital services and AI. She advises clients on the design of new products and services, preparing privacy documentation, and developing data and AI governance programs. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Ruth Scoles Mitchell

Ruth Scoles Mitchell is an associate in the Digital Media Group in London. Ruth advises clients on digital media regulation, in particular in the context of international launches of digital products and content services. Ruth has advised various clients active in the technology space in this regard. Ruth also advises on video content licensing and commercial matters.

Madelaine Harrington

Madelaine Harrington is an associate in the technology and media group. Her practice covers a wide range of regulatory and policy matters at the cross-section of privacy, content moderation, artificial intelligence, and free expression. Madelaine has deep experience with regulatory investigations, and has counseled multi-national companies on complex cross-jurisdictional fact-gathering exercises and responses to alleged non-compliance. She routinely counsels clients on compliance within the EU regulatory framework, including the General Data Protection Regulation (GDPR), among other EU laws and legislative proposals.

Madelaine’s representative matters include:

  • coordinating responses to investigations into the handling of personal information under the GDPR;
  • counseling major technology companies on the use of artificial intelligence, specifically facial recognition technology in public spaces;
  • advising a major technology company on the legality of hacking defense tactics; and
  • advising a content company on compliance obligations under the DSA, including rules regarding recommender systems.

Madelaine’s work has previously involved representing U.S.-based clients on a wide range of First Amendment issues, including defamation lawsuits, access to courts, and FOIA. She maintains an active pro-bono practice representing journalists with various news-gathering needs.