On February 4, 2020, the United Kingdom’s Centre for Data Ethics and Innovation (“CDEI”) published its final report on “online targeting” (the “Report”), examining practices used to monitor people’s online behaviour and customize their experiences accordingly. In October 2018, the UK government tasked the CDEI, an expert committee that advises the government on how to maximize the benefits of new technologies, with exploring how data is used to shape people’s online experiences. The Report sets out the CDEI’s findings and recommendations.

The Report describes common online targeting practices and examines whether and how the regulatory landscape needs to change to address the potential harms associated with online targeting, while recognizing that such targeting can also be beneficial. Ultimately, the CDEI does not propose specific restrictions on online targeting, but recommends that an independent regulator be appointed to focus on improving (i) accountability, (ii) transparency, and (iii) user empowerment in relation to online targeting practices. In these areas, the Report highlights that companies are falling short of the standards for the ethical use of technology set out in the OECD’s human-centred principles on AI (to which the UK has subscribed).

More generally, the CDEI considers that the future “online harms” regulator in the UK (the “Regulator”) should supervise online targeting. Although the Report acknowledges that the UK Information Commissioner’s Office (“ICO”) and Competition and Markets Authority (“CMA”) have played a role in supervising online targeting to date, it notes that their respective areas of competence (data protection and competition law) may not be sufficient to address all possible harms arising from online targeting. A regulator with a broader remit is therefore required, although the Report emphasizes that the Regulator should collaborate with other regulators through formal coordination mechanisms.

The Report concludes by making recommendations in three principal areas:

Accountability

  • The Regulator should focus on online targeting systems, prepare a code of practice setting out standards for those systems, and require online platforms to assess and explain the impact of their systems.
  • The Regulator should have the power to compel the production of information from companies.
  • Oversight of online targeting should cover all types of content (including, but not limited to, advertising), and the Regulator should have a duty to protect freedom of expression and privacy.
  • The UK government should develop a code on online targeting in the public sector to promote “safe, trustworthy innovation in the delivery of personalised advice and support”.

Transparency

  • The Regulator should be able to compel companies to give independent experts access to their data, both to audit targeting systems and to conduct research of potential significance for public policy.
  • Platforms should be required to host publicly accessible archives of online political ads, “opportunity” ads (e.g., ads for jobs, credit and housing), and ads for age-restricted products.
  • The UK government should consider formal mechanisms for collaboration with platforms to tackle “coordinated inauthentic behaviour”.

User Empowerment

  • The Regulator’s approach should encourage platforms to give people greater information about, and control over, online targeting. The CDEI supports the CMA’s proposal for a duty of “fairness by design” on online platforms (which complements the duty of “data protection by design” under Art. 25 GDPR).
  • The Regulator and other authorities should increase coordination of their digital literacy campaigns.
  • The CDEI supports the UK government’s plans to ensure that online electoral adverts are labelled to identify paid-for and targeted content (which will be complemented by the ICO’s forthcoming code of practice on the use of personal data in political campaigning).
  • The Report also supports assistance for emerging “data intermediaries” (i.e., entities mandated by users to interact with digital services on their behalf, providing centralized consent management and authentication, possibly subject to fiduciary duties).

It remains to be seen whether the UK government will implement these recommendations — they may form part of the final online harms package when it is completed. The team at Covington will continue to monitor developments in this area.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Paul Maynard

Paul Maynard is special counsel in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly changing rules, such as the GDPR and cross-border data transfer rules; and advice on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.