On 15 January 2024, the UK’s Information Commissioner’s Office (“ICO”) announced the launch of a consultation series (“Consultation”) on how elements of data protection law apply to the development and use of generative AI (“GenAI”). For the purposes of the Consultation, GenAI refers to “AI models that can create new content e.g., text, computer code, audio, music, images, and videos”.

As part of the Consultation, the ICO will publish a series of chapters over the coming months outlining its thinking on how the UK GDPR and Part 2 of the Data Protection Act 2018 apply to the development and use of GenAI. The first chapter, published in tandem with the Consultation’s announcement, covers the lawful basis, under UK data protection law, for web scraping of personal data to train GenAI models. Interested stakeholders are invited to provide feedback to the ICO by 1 March 2024.

The Lawful Basis for Scraping Personal Data

In its first chapter, the ICO acknowledges that legitimate interests, pursuant to Article 6(1)(f) of the UK GDPR, can be a lawful basis for using web-scraped personal data to train GenAI models. The ICO also notes that, as part of complying with the lawfulness principle of data protection law, developers need to ensure that their processing is not “in breach of any other legislation outside of data protection such as intellectual property or contract law.”

The Three-Part Test for Legitimate Interests

The ICO maintains that, in order for a GenAI model developer to rely on the legitimate interests lawful basis, it must pass the three-part legitimate interests test:

  1. Purpose test: GenAI model developers must first demonstrate a valid interest for processing web-scraped personal data. The ICO acknowledges that such an interest “could be the business interest in developing a model and deploying it for commercial gain, either on their own platform or bringing it into the market for third parties to procure” (e.g., offering a GenAI chatbot to consumers).
  2. Necessity test: Processing of web-scraped data must be necessary to achieve the interest identified in the purpose test. To this end, the ICO notes that, “currently, most generative AI training is only possible using the volume of data obtained through large-scale scraping”.
  3. Balancing test: If the first two limbs of the test are satisfied, the final step is to determine whether the interests, rights and freedoms of individuals override those pursued by the GenAI developer or third party. The ICO identifies two categories of potential risks that GenAI developers should balance against their own interests:
     • Upstream risks: The ICO states that, if people are not aware that their personal data is being processed, they can lose control over that data by virtue of being unable to exercise their information rights (e.g., the right of access under Article 15 of the UK GDPR); and
     • Downstream risks: According to the ICO, third parties can use GenAI models to generate inaccurate information about people, which may result in reputational harm.

Additionally, the ICO notes that there are a number of risk mitigations that may help GenAI developers pass the third part of the legitimate interests test, including: (i) implementing technical and organisational controls over a specific deployment of a model; (ii) monitoring the use of the model (e.g., via API access); and (iii) specifying contractual controls with third parties that limit how the model is used.

Next Steps

Interested stakeholders are invited to provide feedback on the ICO’s first chapter by completing a survey or emailing the ICO by 1 March 2024. The ICO will use the input it receives to update its guidance on AI and other related products.

The ICO also announced that, moving forward, it intends to produce additional chapters with analysis on topics including: (i) how the purpose limitation principle plays out in the context of GenAI development and deployment; (ii) expectations around complying with the accuracy principle; and (iii) expectations around complying with data subject rights.

***

Covington regularly advises the world’s top technology companies on their most challenging regulatory, compliance, and public policy issues in the UK, EU and other major markets. We are monitoring developments in AI policy and regulation closely and will be updating this site regularly – please watch this space for further updates.

Marianna Drake

Marianna Drake counsels leading multinational companies on some of their most complex regulatory, policy and compliance-related issues, including data privacy and AI regulation. She focuses her practice on compliance with UK, EU and global privacy frameworks, and new policy proposals and regulations relating to AI and data. She also advises clients on matters relating to children’s privacy, online safety and consumer protection and product safety laws.

Her practice includes defending organizations in cross-border, contentious investigations and regulatory enforcement in the UK and EU Member States. Marianna also routinely partners with clients on the design of new products and services, drafting and negotiating privacy terms, developing privacy notices and consent forms, and helping clients design governance programs for the development and deployment of AI technologies.

Marianna’s pro bono work includes providing data protection advice to UK-based human rights charities, and supporting a non-profit organization in conducting legal research for strategic litigation.

Will Capstick

Will Capstick is an associate in the Corporate Practice Group in the London office. He advises clients on a broad range of corporate matters.

Will also has experience advising clients operating in the digital media space in relation to the creation, acquisition, and distribution of content.

Will is committed to pro bono and provides ongoing support to a charity in challenging the death penalty in the US as well as immigration law advice to families seeking leave to remain in the UK.

Marty Hansen

Martin Hansen has over two decades of experience representing some of the world’s leading innovative companies in the internet, IT, e-commerce, and life sciences sectors on a broad range of regulatory, intellectual property, and competition issues, including issues relating to artificial intelligence. Martin has extensive experience in advising clients on matters arising under EU, U.S., and UK law, the World Trade Organization agreements, and other trade agreements.

Lisa Peets

Lisa Peets is co-chair of the firm’s Technology and Communications Regulation Practice Group and a member of the firm’s global Management Committee. Lisa divides her time between London and Brussels, and her practice encompasses regulatory compliance and investigations alongside legislative advocacy. For more than two decades, she has worked closely with many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU and UK legal frameworks affecting technology providers, including data protection, content moderation, artificial intelligence, platform regulation, copyright, e-commerce and consumer protection, and the rapidly expanding universe of additional rules applicable to technology, data and online services.

Lisa also supports Covington’s disputes team in litigation involving technology providers.

According to Chambers UK (2024 edition), “Lisa provides an excellent service and familiarity with client needs.”

Mark Young

Mark Young is an experienced tech regulatory lawyer and a vice-chair of Covington’s Data Privacy and Cybersecurity Practice Group. He advises major global companies on their most challenging data privacy compliance matters and investigations. Mark also leads on EMEA cybersecurity matters at the firm. In these contexts, he has worked closely with some of the world’s leading technology and life sciences companies and other multinationals.

Mark has been recognized for several years in Chambers UK as “a trusted adviser – practical, results-oriented and an expert in the field;” “fast, thorough and responsive;” “extremely pragmatic in advice on risk;” “provides thoughtful, strategic guidance and is a pleasure to work with;” and has “great insight into the regulators.” According to the most recent edition (2024), “He’s extremely technologically sophisticated and advises on true issues of first impression, particularly in the field of AI.”

Drawing on over 15 years of experience, Mark specializes in:

  • Advising on potential exposure under GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology, e.g., AI, biometric data, and connected devices.
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • Counseling ad networks (demand and supply side), retailers, and other adtech companies on data privacy compliance relating to programmatic advertising, and providing strategic advice on complaints and claims in a range of jurisdictions.
  • Advising life sciences companies on industry-specific data privacy issues, including:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • engagement with healthcare professionals and marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance (collecting data from employees and others, international transfers, etc.).
  • Advising various clients on the EU NIS2 Directive and UK NIS regulations and other cybersecurity-related regulations, particularly (i) cloud computing service providers, online marketplaces, social media networks, and other digital infrastructure and service providers, and (ii) medical device and pharma companies, and other manufacturers.
  • Helping a broad range of organizations prepare for and respond to cybersecurity incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, supply chain incidents, and state-sponsored attacks. Mark’s incident response expertise includes:
    • supervising technical investigations and providing updates to company boards and leaders;
    • advising on PR and related legal risks following an incident;
    • engaging with law enforcement and government agencies; and
    • advising on notification obligations and other legal risks, and representing clients before regulators around the world.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially those involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors).
  • Providing strategic advice and advocacy on a range of UK and EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.