The Commission and the European Board for Digital Services have announced the integration of the revised voluntary Code of conduct on countering illegal hate speech online + (“Code of Conduct+”) into the framework of the Digital Services Act (“DSA”). Article 45 of the DSA states that, where significant systemic risks emerge under Article 34(1) (concerning the obligation on very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”) to identify, analyse, and assess systemic risks), and concern several VLOPs or VLOSEs, the Commission may invite VLOPs and VLOSEs to participate in the drawing up of codes of conduct, including commitments to take risk mitigation measures and to report on those measures and their outcomes. The Code of Conduct+ was adopted in this context. VLOPs and VLOSEs’ adherence to the Code of Conduct+ may be considered as a risk mitigation measure under Article 35 DSA, but participation in and implementation of the Code of Conduct+ “should not in itself presume compliance with [the DSA]” (Recital 104).

The Code of Conduct+—which builds on the Commission’s original Code of Conduct on countering illegal hate speech online, published in 2016—seeks to strengthen how Signatories address content defined by EU and national laws as illegal hate speech. Adhering to the Code of Conduct+’s commitments will be part of the annual independent audit of VLOPs and VLOSEs required by the DSA (Art. 37(1)(b)), but smaller companies are free to sign up to the Code as well.

Key Commitments of the Signatories

Signatories commit to:

  • Allowing a network of “Monitoring Reporters” to monitor how they review hate speech notices. Monitoring Reporters, which may include DSA “Trusted Flaggers”, are not-for-profit or public entities with expertise in illegal hate speech.
  • Reviewing at least half (and undertaking best efforts to review two-thirds) of hate speech notices from Monitoring Reporters within 24 hours.
  • Participating in an annual Monitoring Exercise that reviews adherence to specific commitments, the results of which will be published by the Commission. The methodology of the Monitoring Exercise, and the information that the Commission will include in its report, are set out in Annex 1 to the Code of Conduct+.
  • Providing a short summary of “information on the measures taken to address illegal hate speech as part of their content moderation policies” which will “accompany[] the[] results of each monitoring exercise”. The points signatories should include in the summary, where relevant, are set out in Annex 2 to the Code of Conduct+.
  • Participating in structured collaboration with experts and civil society organizations to observe trends and developments in hate speech.
  • Collaborating with civil society organizations to raise user awareness about illegal hate speech and the procedures for reporting such content.

The Commission press release cautions that “participating in and implementing a given code of conduct . . . does not in itself presume compliance with the DSA and is without prejudice to the Commission’s assessment [of a Signatory’s compliance with the DSA] on a case-by-case basis” (emphasis added). That said, the press release also notes that “online platforms who are designated under the DSA [as VLOPs] can adhere to the Code of [C]onduct+ to demonstrate their compliance with the DSA obligation to mitigate the risk of the dissemination of illegal content on their services”. The evaluation of whether platforms achieve the Code of Conduct+’s objectives “will [also] be part of the continuous monitoring of platforms’ compliance with existing rules.”

* * *

The Covington team regularly advises clients on their compliance with the DSA and other legislation affecting technology companies. Please reach out to a member of the team if you have any questions.

Lisa Peets

Lisa Peets is co-chair of the firm’s Technology and Communications Regulation Practice Group and a member of the firm’s global Management Committee. Lisa divides her time between London and Brussels, and her practice encompasses regulatory compliance and investigations alongside legislative advocacy. For more than two decades, she has worked closely with many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU and UK legal frameworks affecting technology providers, including data protection, content moderation, artificial intelligence, platform regulation, copyright, e-commerce and consumer protection, and the rapidly expanding universe of additional rules applicable to technology, data and online services.

Lisa also supports Covington’s disputes team in litigation involving technology providers.

According to Chambers UK (2024 edition), “Lisa provides an excellent service and familiarity with client needs.”

Marty Hansen

Martin Hansen has over two decades of experience representing some of the world’s leading innovative companies in the internet, IT, e-commerce, and life sciences sectors on a broad range of regulatory, intellectual property, and competition issues, including issues related to artificial intelligence. Martin has extensive experience in advising clients on matters arising under EU, U.S., and UK law, the World Trade Organization agreements, and other trade agreements.

Madelaine Harrington

Madelaine Harrington is an associate in the technology and media group. Her practice covers a wide range of regulatory and policy matters at the cross-section of privacy, content moderation, artificial intelligence, and free expression. Madelaine has deep experience with regulatory investigations, and has counseled multi-national companies on complex cross-jurisdictional fact-gathering exercises and responses to alleged non-compliance. She routinely counsels clients on compliance within the EU regulatory framework, including the General Data Protection Regulation (GDPR), among other EU laws and legislative proposals.

Madelaine’s representative matters include:

  • coordinating responses to investigations into the handling of personal information under the GDPR;
  • counseling major technology companies on the use of artificial intelligence, specifically facial recognition technology in public spaces;
  • advising a major technology company on the legality of hacking defense tactics; and
  • advising a content company on compliance obligations under the DSA, including rules regarding recommender systems.

Madelaine’s work has previously involved representing U.S.-based clients on a wide range of First Amendment issues, including defamation lawsuits, access to courts, and FOIA. She maintains an active pro-bono practice representing journalists with various news-gathering needs.