On February 16, 2024, the UK Information Commissioner’s Office (ICO) published guidance on content moderation and data protection. The guidance complements the Online Safety Act (OSA), the UK’s legislation designed to ensure digital platforms mitigate illegal and harmful content. The ICO underlines that if an organisation carries out content moderation that involves personal information, “[it] must comply with data protection law.” The guidance highlights particular elements of data protection compliance that organisations should keep in mind, including establishing a lawful basis for processing, being transparent when moderating content, and complying with rules on automated decision-making. We summarize the key points below.
According to the guidance, the processing of personal information for content moderation purposes must be:
- Lawful. Organisations must identify a lawful basis before using personal information in their content moderation systems. The guidance states that the bases most likely to be relevant to content moderation are “legal obligation” (i.e., the need to comply with statutory obligations such as those in the OSA) and legitimate interests. Where content moderation involves special category data, the organisation must identify a condition for processing such information under Article 9 of the UK GDPR. Where the processing involves criminal offense information (and is not under the control of an official authority), the organisation must identify a condition for processing set out in Schedule 1 of the Data Protection Act 2018.
- Fair. Content moderation involving personal data must be carried out in a manner that people would reasonably expect, and moderation systems must “perform accurately” and “produce unbiased, consistent outputs.” The guidance recommends that organisations conduct checks of their content moderation systems for accuracy and bias.
- Transparent. Organisations must provide users with certain information, including: why they are using personal information; the lawful basis for processing; the types of personal information used; the decisions taken with the information and how these may impact use of the service; whether the organisation is keeping the personal information and, if so, for how long; whether the information is shared with other organisations; and how users may exercise their data protection rights. The guidance notes that the OSA also requires providers of regulated user-to-user services to disclose information about proactive technology used to comply with the illegal content safety duties or the safety duties for the protection of children.
- Purpose-limited. Organisations must be clear about why they are using personal information for content moderation (for example, to comply with the OSA) and what they intend to do with the information.
- Subject to data minimization principles. The guidance notes that content moderation is “highly contextual” and may require information beyond the content that is the subject of the moderation, such as previous posts or records of interaction with the service. While an organisation engaging in content moderation may take such information into account, it must be able to demonstrate that doing so is necessary to achieve its purpose and that no less intrusive option is available.
- Accurate. Organisations must “take all reasonable steps” to ensure personal information used and generated through content moderation is correct and not misleading. Where an organisation determines that a user has violated its policies, it must ensure that any record on the user’s account is accurate and up to date.
- Secure. Organisations must put in place technical and organisational measures that ensure an appropriate level of data security. Where an organisation uses a third-party provider to carry out content moderation, it “must choose one that provides sufficient guarantees about its security measures.”
Organisations that use automated decision-making in content moderation must also determine whether that process involves “solely automated decisions that have legal or similarly significant effects on people” for the purposes of Article 22 of the UK GDPR. The guidance provides several examples specific to content moderation systems to aid organisations in making this determination. Where processing qualifies as automated decision-making under Article 22, the guidance underscores that such processing may only take place if it is authorised by domestic law, necessary for a contract, or based on a person’s explicit consent.
Finally, the guidance states that personal information obtained in the course of content moderation must not be kept for longer than necessary; organisations should not retain information “‘just in case’ it might be relevant in the future”. The guidance recommends that organisations have clear contractual agreements with third parties limiting the information they retain to what is necessary.
* * *
The Covington team regularly advises on privacy and content-moderation laws in the UK and the EU. The nexus between data privacy and content moderation is an evolving area, and we will continue to monitor developments. We are happy to answer any questions you may have on this topic.