Children's Privacy

On March 25, 2026, the UK’s Office of Communications (“Ofcom”) and the Information Commissioner’s Office (“ICO”) published a joint statement setting out their common expectations for age assurance on online services (“Joint Statement”). The Joint Statement is aimed at services likely to be accessed by children that fall within the scope of the Online Safety Act 2023 (“OSA”) and UK data protection legislation, and is designed to help providers comply with both their online safety and data protection obligations when deploying age assurance.

The Joint Statement arrives alongside a broader push from both regulators, including Ofcom's recent call to action directed at major tech firms, an open letter from the ICO urging platforms to strengthen their age checks, and several enforcement actions.

Continue Reading Ofcom and ICO Issue Joint Statement on Age Assurance

On March 2, 2026, the UK Department for Science, Innovation and Technology (“DSIT”) launched its consultation, titled “Growing up in the online world: a national conversation”. The consultation is open until May 26, 2026, after which the government will publish a summary of responses and its proposed approach. DSIT has indicated that it intends to move quickly on the consultation’s findings, drawing on newly granted powers that allow for accelerated implementation of online safety measures.

The consultation seeks views on a wide range of potential measures to strengthen children’s safety and wellbeing online, including more robust age‑assurance mechanisms, a statutory minimum age for social media, raising the UK’s age of digital consent, restrictions on certain features (such as livestreaming and disappearing messages), and new obligations for AI chatbots and generative‑AI services.

DSIT’s proposals could significantly expand regulatory expectations beyond the Online Safety Act 2023 (“OSA”), including potential age‑based access limits (with differing safeguards for teens and younger children), feature‑level restrictions, and enhanced duties for AI‑enabled services. Early engagement will be important to ensure that the government takes account of the views of affected service providers and understands the operational and technical implications of the measures proposed.

Continue Reading UK Government Launches Consultation on Children’s Online Experiences, Including New Obligations for AI

On 14 July 2025, the European Commission published its final guidelines on the protection of minors under the Digital Services Act (“DSA”) (the “Guidelines”). The Guidelines are intended to provide guidance to providers of online platforms that are “accessible to minors” on meeting their obligations to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service” (DSA, Art. 28(1)).

The European Commission published a draft version of the guidelines for consultation on 13 May 2025 (“Draft Guidelines”) (see our blog post here). The final Guidelines include some amendments to the Draft Guidelines on the basis of the feedback received during consultation, clarifying and building out further the recommended measures.

Although the Guidelines are non-binding, the Commission has made clear that it intends to use the Guidelines as a “significant and meaningful” benchmark when assessing in-scope providers’ compliance with Article 28(1) DSA.

Continue Reading European Commission Makes New Announcements on the Protection of Minors Under the Digital Services Act

On 24 April 2025, Ofcom published a statement on the protection of children online (“Statement”). The Statement includes Ofcom’s final Children’s Risk Assessment Guidance (“Guidance”). Publication of the Guidance triggers the deadline for service providers regulated by the Online Safety Act 2023 (“OSA”) to complete their first “children’s risk assessment” (“CRA”) by 24 July 2025. The Statement also confirms that the draft Protection of Children Codes of Practice for user-to-user and search services (“Codes”) have been laid before Parliament. Subject to completion of the Parliamentary process, providers must comply with the OSA’s “safety duties protecting children” from 25 July 2025.

Who do the Codes and Guidance apply to?

The Codes and Guidance apply to providers of “user-to-user” and “search” services that are “likely to be accessed by children”, which is determined based on a test set out in the OSA. In-scope providers were required to have completed an assessment, known as a “children’s access assessment”, by 16 April 2025 to determine if their services satisfy this test.

Continue Reading Ofcom publishes statement on the protection of children online

On April 3, 2024, the UK Information Commissioner’s Office (“ICO”) published its 2024-2025 Children’s code strategy (the “Strategy”), which sets out its priorities for protecting children’s personal information online. This builds on the Children’s code of practice (“Children’s Code”), which the ICO introduced in 2021 to ensure that all online services that process children’s data are designed in a manner that is safe for children.

Continue Reading ICO sets out 2024-2025 priorities to protect children online

On 20 February 2024, the Governments of the UK and Australia co-signed the UK-Australia Online Safety and Security Memorandum of Understanding (“MoU”). The MoU seeks to serve as a framework for the two countries to jointly deliver concrete and coordinated online safety and security policy initiatives and outcomes to support their citizens, businesses and economies.

The MoU comes shortly after the UK Information Commissioner’s Office (“ICO”) introduced its guidance on content moderation and data protection (see our previous blog here) to complement the UK’s Online Safety Act 2023, and the commencement of the Australian online safety codes, which complement the Australian Online Safety Act 2021.

The scope of the MoU is broad, covering a range of policy areas, including: harmful online behaviour; age assurance; safety by design; online platforms; child safety; technology-facilitated gender-based violence; safety technology; online media and digital literacy; user privacy and freedom of expression; online child sexual exploitation and abuse; terrorist and violent extremist content; lawful access to data; encryption; misinformation and disinformation; and the impact of new, emerging and rapidly evolving technologies such as artificial intelligence (“AI”).

Continue Reading UK and Australia Agree Enhanced Cross-Border Cooperation in Online Safety and Security

The UK Government has announced plans to introduce new rules on online advertising for online platforms, intermediaries, and publishers. The aim is to prevent illegal advertising and to introduce additional protections against harmful online ads for under-18s. Full details are set out in its recently published response (“Response”) to the Department for Culture, Media & Sport’s 2022 Online Advertising Programme Consultation (“Consultation”).

The new rules would sit alongside the proposed UK Online Safety Bill (“OSB”), which addresses rules on user-generated content (see our previous blog here). Since the EU’s Digital Services Act (which starts to apply from February 2024, see our previous blog here) will not apply in the UK following Brexit, the OSB and any new rules following this Response form the UK’s approach to regulating these matters, as distinct from the EU.

Continue Reading Further Regulation of Illegal Advertising: UK Government Publishes Response to its Online Advertising Programme Consultation

This quarterly update summarizes key legislative and regulatory developments in the first quarter of 2023 related to Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and autonomous vehicles (“CAVs”), and data privacy and cybersecurity.

Continue Reading U.S. AI, IoT, CAV, and Privacy & Cybersecurity Legislative & Regulatory Update – First Quarter 2023

In this update, we detail the key legislative developments in the second quarter of 2021 related to artificial intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), and federal privacy legislation. As we recently covered on May 12, President Biden signed an Executive Order to strengthen the federal government’s ability to respond to and prevent cybersecurity threats, including by removing obstacles to sharing threat information between private sector entities and federal agencies and modernizing federal systems. On the Hill, lawmakers have introduced a number of proposals to regulate AI, IoT, CAVs, and privacy.
Continue Reading U.S. AI, IoT, CAV, and Privacy Legislative Update – Second Quarter 2021

On February 11, 2021, the European Commission launched a public consultation on its initiative to fight child sexual abuse online (the “Initiative”), which aims to impose obligations on online service providers to detect child sexual abuse online and to report it to public authorities. The consultation is part of the data collection activities announced in the Initiative’s inception impact assessment issued in December 2020. The consultation runs until April 15, 2021, and the Commission intends to propose the necessary legislation by the end of the second quarter of 2021.
Continue Reading European Commission Launches Consultation on Initiative to Fight Child Sexual Abuse