On December 15, 2020, the European Commission published its proposed Regulation on a Single Market for Digital Services, more commonly known as the Digital Services Act (“DSA Proposal”). In publishing the Proposal, the Commission noted that its goal was to protect consumers and their fundamental rights online, establish an accountability framework for online services, and foster innovation, growth and competitiveness in the single market. On the same day, the Commission also published its proposal for a Digital Markets Act (“DMA”), which would impose new obligations and restrictions on online services that act as “designated gatekeepers” (see our analysis of the DMA Proposal here).
I. Obligations on different categories of digital service providers
The DSA Proposal would impose new obligations on four distinct categories of online services, according to their role, size, and impact. The categories are:
- Intermediary services, defined as “mere conduit”, “caching” and “hosting” services (art. 2(f));
- Hosting services, defined as a “service that consists of the storage of information provided by, and at the request of, a recipient of the service” (art. 2(f));
- Online platforms, defined as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)); and
- “Very large” online platforms, defined as “online platforms which provide their services to a number of average monthly active recipients . . . in the Union equal to or higher than 45 million” (art. 25(1)-(4)).
Each category of services is a subset of the one that precedes it, and providers in each category must comply both with the obligations specific to that category and with those that apply to the broader categories containing it. Thus, for instance, “online platforms” must also comply with the obligations that apply to “hosting services” and to “intermediary services.”
As discussed in Part II, providers’ compliance with their DSA obligations would be overseen and enforced by new national “Digital Services Coordinators” (“DSCs”) in each Member State. Although the details of these obligations are too extensive to list here, they include the following:
A. Obligations on intermediary services
The DSA Proposal largely restates the safe harbors from liability for intermediary services—namely, mere conduits, caching services, and hosting services—set out in Articles 12-14 of the EU E-Commerce Directive, 2000/31/EC. However, under the DSA, the safe harbor for hosting services would not apply to online platforms that allow consumers to engage in an e-commerce transaction where the consumer reasonably believes that the product, information, or service at issue is being offered by the platform itself, or by a trader acting under its authority or control (art. 5).
In addition, intermediary service providers would need to comply with Member State judicial and administrative orders to remove illegal content, and to provide information about specific users of the service (arts. 8 and 9). These providers would also need to disclose, in their terms of use, any restrictions they impose on user content (e.g., offensive or harmful but legal content), and any “policies, procedures, measures and tools” they use for content moderation (art. 12). Finally, intermediary service providers would need to publish annual reports with data on aspects of their content moderation activities, including the number of orders they received and how they responded (art. 13).
B. Obligations on hosting services
Hosting services would need to do essentially two things (in addition to complying with the obligations on intermediary services). First, they would need to establish a mechanism through which users could notify them of suspected illegal content on their services (art. 14). Second, they would need to inform users when they remove their content and give such users a statement of reasons for the decision. Providers would also need to publish their content removal decisions, along with the statements of reasons, in a publicly accessible database managed by the Commission (art. 15).
C. Obligations on online platforms
The DSA Proposal sets out significantly more obligations on “online platforms”—which, again, the DSA defines as a hosting service that, “at the request of a recipient of the service, stores and disseminates [information] to the public” (art. 2(h)). These include obligations to:
- Establish an internal complaint-handling procedure for users whose content has been removed, including where their service or account has been suspended or terminated (art. 17).
- Allow users to challenge such decisions before an out-of-court dispute settlement body that has been certified by a DSC (art. 18).
- Give priority review to notices of illegal content issued by “trusted flaggers”—entities designated by DSCs based on their particular expertise to detect and identify illegal content (art. 19).
- Suspend users that frequently post “manifestly illegal content,” and individuals that frequently submit “manifestly unfounded” notices or complaints (art. 20).
- Notify national law enforcement or judicial authorities where the platform suspects that a crime “involving a threat to the life or safety of persons” has taken place or is likely to take place (art. 21).
- Demand identity-verifying information from traders that offer goods or services on the platform, and suspend traders that fail to provide the information (art. 22).
- Provide basic transparency about advertisements that appear on the platform, including the identity of the person or entity on whose behalf the ad is displayed (art. 24).
- Provide an enhanced version of the annual transparency reports required by article 13, and file reports every six months with their respective DSC on the average monthly active recipients of the service in each Member State (art. 23).
D. Obligations on very large platforms
The DSA Proposal reserves the most onerous obligations for “very large online platforms”—i.e., those with over 45 million average monthly active recipients in the EU. These platforms would need to conduct risk assessments relating to illegal content on their services (art. 26); implement mitigating measures for any risks identified (art. 27); arrange for annual independent audits of their compliance with the DSA (art. 28); describe the “main parameters” of any “recommender systems” they use with respect to content on their services (art. 29); and provide enhanced transparency about ads appearing on their services (art. 30).
Very large online platforms would also need to provide their DSC or the Commission with access to internal data (art. 31), assign an internal compliance officer (art. 32), and publish extensive transparency reports every six months (art. 33). Finally, very large online platforms would be subject to “enhanced supervision” by DSCs (arts. 50-66).
II. Oversight and enforcement regime
The DSA Proposal also introduces an extensive oversight and enforcement regime for covered services. In addition to requiring Member States to appoint a DSC, the Proposal would establish a European Board for Digital Services (“EBDS”), composed of the DSCs of all Member States (arts. 47-49). The Proposal authorizes DSCs to impose fines of up to six percent of a service’s annual global turnover for violations of the Act (art. 42).
* * * * *
The DSA Proposal is the first step in the EU legislative process, in which both the European Parliament and the Council of the European Union can amend the Commission’s proposal. The team at Covington will continue to monitor developments in this space and stands ready to answer any questions that companies may have.
Reach out to a member of our technology, privacy, antitrust and policy teams with questions.