On December 15, 2020, the European Commission published its proposed Regulation on a Single Market for Digital Services, more commonly known as the Digital Services Act (“DSA Proposal”).  In publishing the Proposal, the Commission noted that its goal was to protect consumers and their fundamental rights online, establish an accountability framework for online services, and foster innovation, growth and competitiveness in the single market.  On the same day, the Commission also published its proposal for a Digital Markets Act (“DMA”), which would impose new obligations and restrictions on online services that act as “designated gatekeepers” (see our analysis of the DMA Proposal here).

I. Obligations on different categories of digital service providers

The DSA Proposal would impose new obligations on four distinct categories of online services, according to their role, size, and impact.  The categories are:

  1. Intermediary services, defined as “hosting”, “mere conduit” and “caching” services (art. 2(f));
  2. Hosting services, defined as a “service that consists of the storage of information provided by, and at the request of, a recipient of the service” (art. 2(f));
  3. Online platforms, defined as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)); and
  4. “Very large” online platforms, defined as “online platforms which provide their services to a number of average monthly active recipients . . . in the Union equal to or higher than 45 million” (art. 25(1)-(4)).

Each category of services is a subset of the one that precedes it and must comply with all the obligations that apply to it and those that apply to the preceding categories.  Thus, for instance, “online platforms” must also comply with the obligations that apply to “hosting services” and to “intermediary services.”

As discussed in Part II, providers’ compliance with their DSA obligations would be overseen and enforced by new national “Digital Services Coordinators” (“DSCs”) in each Member State.  Although the details of these obligations are too extensive to list here, they include the following:

A. Obligations on intermediary services

The DSA largely restates the safe harbors from liability for intermediary services—namely, caching services, mere conduits, and hosting services—set out in Articles 12-14 of the EU E-Commerce Directive, 2000/31/EC.  However, under the DSA, the safe harbor for hosting services would not apply to online platforms that allow consumers to engage in an e-commerce transaction where the consumer reasonably believes that the product, information, or service at issue is being offered by the platform itself, or by a trader under its control (art. 5).

In addition, intermediary service providers would need to comply with Member State judicial and administrative orders to remove illegal content, and to provide information about specific users of the service (arts. 8 and 9). These providers would also need to disclose, in their terms of use, any restrictions they impose on user content (e.g., offensive or harmful but legal content), and any “policies, procedures, measures and tools” they use for content moderation (art. 12).  Finally, intermediary service providers would need to publish annual reports with data on aspects of their content moderation activities, including the number of orders they received and how they responded (art. 13).

B. Obligations on hosting services

Hosting services would need to do essentially two things (in addition to complying with the obligations on intermediary services).  First, they would need to establish a mechanism through which users could notify them of suspected illegal content on their services (art. 14).  Second, they would need to inform users when they remove their content, and give such users a statement of reasons for the decision.  Providers would also need to publish their content removal decisions, along with the statements of reasons, in a publicly accessible database managed by the Commission (art. 15).

C. Obligations on online platforms

The DSA Proposal sets out significantly more obligations on “online platforms”—which, again, the DSA defines as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)).  These include obligations to:

  • Establish an internal complaint-handling procedure for users whose content has been removed, including where their service or account has been suspended or terminated (art. 17).
  • Allow users to challenge such decisions before an out-of-court dispute settlement body that has been certified by a DSC (art. 18).
  • Give priority review to notices of illegal content issued by “trusted flaggers”—entities designated by DSCs based on their particular expertise to detect and identify illegal content (art. 19).
  • Suspend users that frequently post “manifestly illegal content,” and individuals that frequently submit “manifestly unfounded” notices or complaints (art. 20).
  • Notify national law enforcement or judicial authorities where the platform suspects that a crime “involving a threat to the life or safety of persons” has taken or will take place (art. 21).
  • Demand identity-verifying information from traders that offer goods or services on the platform, and suspend traders that fail to provide the information (art. 22).
  • Provide basic transparency about advertisements that appear on the platform, including the identity of the person or entity on whose behalf the ad is displayed (art. 24).
  • Provide an enhanced version of the annual transparency reports required by article 13, as well as file reports every six months with their respective DSC on the average monthly active recipients of the service in each Member State (art. 23).

D. Obligations on very large platforms

The DSA Proposal reserves the most onerous obligations for “very large online platforms”—i.e., those with 45 million or more average monthly active recipients in the EU.  These platforms would need to conduct risk assessments relating to illegal content on their services (art. 26); implement measures to mitigate any risks identified (art. 27); arrange for annual independent audits of their compliance with the DSA (art. 28); describe the “main parameters” of any “recommender systems” they use with respect to content on their services (art. 29); and provide enhanced transparency about ads appearing on their services (art. 30).

Very large online platforms would also need to provide their DSC or the Commission with access to internal data (art. 31), assign an internal compliance officer (art. 32), and publish extensive transparency reports every six months (art. 33).  Finally, very large online platforms would be subject to “enhanced supervision” by DSCs (arts. 50-66).

II. Oversight and enforcement regime

The DSA Proposal also introduces an extensive oversight and enforcement regime for covered services.  In addition to requiring Member States to appoint a DSC, the Proposal would establish a European Board for Digital Services (“EBDS”), made up of all of the individual Member State DSCs (arts. 47-49).  The Proposal authorizes DSCs to impose fines of up to six percent of a service’s annual global turnover for violations of the Act (art. 42).

*  *  *  *  *

The DSA Proposal is the first step in the EU legislative process, which includes the European Parliament and the Council of the European Union, both of which can amend the Commission’s proposal.  The team at Covington will continue to monitor developments in this space and stand ready to answer any questions that companies may have.

Reach out to a member of our technology, privacy, antitrust and policy teams with questions.

Lisa Peets

Lisa Peets is co-chair of the firm’s Technology and Communications Regulation Practice Group and a member of the firm’s global Management Committee. Lisa divides her time between London and Brussels, and her practice encompasses regulatory compliance and investigations alongside legislative advocacy. For more than two decades, she has worked closely with many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU and UK legal frameworks affecting technology providers, including data protection, content moderation, artificial intelligence, platform regulation, copyright, e-commerce and consumer protection, and the rapidly expanding universe of additional rules applicable to technology, data and online services.

Lisa also supports Covington’s disputes team in litigation involving technology providers.

According to Chambers UK (2024 edition), “Lisa provides an excellent service and familiarity with client needs.”

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Mark Young

Mark Young is an experienced tech regulatory lawyer and a vice-chair of Covington’s Data Privacy and Cybersecurity Practice Group. He advises major global companies on their most challenging data privacy compliance matters and investigations. Mark also leads on EMEA cybersecurity matters at the firm. In these contexts, he has worked closely with some of the world’s leading technology and life sciences companies and other multinationals.

Mark has been recognized for several years in Chambers UK as “a trusted adviser – practical, results-oriented and an expert in the field;” “fast, thorough and responsive;” “extremely pragmatic in advice on risk;” “provides thoughtful, strategic guidance and is a pleasure to work with;” and has “great insight into the regulators.” According to the most recent edition (2024), “He’s extremely technologically sophisticated and advises on true issues of first impression, particularly in the field of AI.”

Drawing on over 15 years of experience, Mark specializes in:

  • Advising on potential exposure under GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology, e.g., AI, biometric data, and connected devices.
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • Counseling ad networks (demand and supply side), retailers, and other adtech companies on data privacy compliance relating to programmatic advertising, and providing strategic advice on complaints and claims in a range of jurisdictions.
  • Advising life sciences companies on industry-specific data privacy issues, including:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • engagement with healthcare professionals and marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance (collecting data from employees and others, international transfers, etc.).
  • Advising various clients on the EU NIS2 Directive and UK NIS regulations and other cybersecurity-related regulations, particularly (i) cloud computing service providers, online marketplaces, social media networks, and other digital infrastructure and service providers, and (ii) medical device and pharma companies, and other manufacturers.
  • Helping a broad range of organizations prepare for and respond to cybersecurity incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, supply chain incidents, and state-sponsored attacks. Mark’s incident response expertise includes:
    • supervising technical investigations and providing updates to company boards and leaders;
    • advising on PR and related legal risks following an incident;
    • engaging with law enforcement and government agencies; and
    • advising on notification obligations and other legal risks, and representing clients before regulators around the world.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors.)
  • Providing strategic advice and advocacy on a range of UK and EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.

Marty Hansen

Martin Hansen has over two decades of experience representing some of the world’s leading innovative companies in the internet, IT, e-commerce, and life sciences sectors on a broad range of regulatory, intellectual property, and competition issues, including related to artificial intelligence. Martin has extensive experience in advising clients on matters arising under EU and U.S. law, UK law, the World Trade Organization agreements, and other trade agreements.

Stacy Young

Stacy Young is an associate in the London office. She advises technology and life sciences companies across a range of privacy and regulatory issues spanning AI, clinical trials, data protection and cybersecurity.