Content Moderation
Content moderation encompasses the processes a platform uses to identify, assess and act on user-generated content. Under the Digital Services Act (DSA), platforms must ensure transparent moderation processes, provide reasons for content removal and offer complaint mechanisms for users.
What is content moderation?
Content moderation covers all the activities a platform performs to handle user-generated content. This ranges from removing illegal content such as terrorist propaganda and child sexual abuse material to enforcing the platform's own terms regarding hate speech or misinformation.
The DSA defines content moderation broadly. It encompasses not only removal of content but also restriction of visibility, addition of warnings or labels, suspension of user accounts and blocking of users.
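To make this breadth concrete, the minimal sketch below models the action types listed above as a Python enum. It is purely illustrative; the class and member names are our own invention, not DSA terminology.

```python
from enum import Enum, auto

class ModerationAction(Enum):
    """Illustrative action types covered by the DSA's broad
    definition of content moderation (names are not DSA terms)."""
    REMOVE_CONTENT = auto()       # take the content down entirely
    RESTRICT_VISIBILITY = auto()  # e.g. demotion or geo-blocking
    ADD_WARNING_LABEL = auto()    # keep the content up, but flag it
    SUSPEND_ACCOUNT = auto()      # temporarily disable the account
    BLOCK_USER = auto()           # cut off the user's access
```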
For platforms, moderation is a constant balancing act between protecting users against harmful content and respecting freedom of expression. The DSA seeks to strike this balance by placing requirements on the process, not on the specific outcome.
DSA requirements for moderation
The DSA imposes a range of obligations on platforms to ensure content moderation takes place transparently and fairly. These include transparent terms of service describing the moderation policy, a duty to provide reasons when removing content, notice-and-action mechanisms for reporting illegal content, and annual transparency reports. Reports from trusted flaggers must be processed with priority.
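To illustrate the duty to provide reasons, here is a hedged sketch of what a statement-of-reasons record might gather: the action taken, the facts relied on, whether automation played a role, the legal or contractual ground, and the redress options open to the user. The field names are assumptions made for this example, not the DSA's own wording.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StatementOfReasons:
    """Hypothetical statement-of-reasons record; the field names are
    illustrative, not quoted from the DSA."""
    action: str                     # e.g. "remove", "restrict_visibility"
    facts: str                      # facts and circumstances relied on
    automated_detection: bool       # was the content flagged by a tool?
    automated_decision: bool        # was the decision itself automated?
    legal_ground: Optional[str] = None        # law cited, if illegal content
    contractual_ground: Optional[str] = None  # terms clause cited, if any
    redress: list[str] = field(default_factory=lambda: [
        "internal complaint-handling system",
        "out-of-court dispute settlement",
        "judicial redress",
    ])
```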
Automated moderation and human oversight
With billions of posts published daily on major platforms, automated moderation tools are unavoidable. The DSA acknowledges this but requires that platforms disclose their use of automated tools, that users affected by automated decisions are informed, and that human oversight is in place. At the same time, the DSA prohibits general monitoring obligations: platforms cannot be required to proactively scan all content. These requirements tie into the DSA's broader rules on algorithmic transparency.
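As a sketch of what automated tools with human oversight can look like in practice, a classifier might only act on its own at very high confidence, route borderline cases to a human review queue, and record every automated action so the affected user can be informed. The thresholds and names below are invented for this example.

```python
def triage(content_id: str, illegal_score: float,
           review_queue: list[str], decisions: list[dict]) -> None:
    """Route one flagged item; `illegal_score` is the classifier's
    confidence in [0, 1]. Thresholds are purely illustrative."""
    if illegal_score >= 0.98:
        # High confidence: act automatically, but record that the
        # decision was automated so the user can be informed.
        decisions.append({"content": content_id,
                          "action": "remove",
                          "automated": True})
    elif illegal_score >= 0.60:
        # Borderline: hand off to a human moderator instead of acting.
        review_queue.append(content_id)
    # Below the lower threshold: no automated action is taken.
```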
Complaint access and dispute resolution
The DSA gives users the right to complain about moderation decisions. Online platforms must establish an internal complaint-handling system. If the user is dissatisfied with the outcome, the case can be brought before an independent out-of-court dispute settlement body. Users may also complain to the national Digital Services Coordinator.
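The escalation path can be pictured as a short sequence of stages, sketched below with invented names; a complaint to the Digital Services Coordinator sits alongside these stages rather than being a step in the sequence.

```python
from enum import Enum
from typing import Optional

class RedressStage(Enum):
    """Simplified model of the user's escalation path (illustrative)."""
    PLATFORM_DECISION = "moderation decision with statement of reasons"
    INTERNAL_COMPLAINT = "platform's internal complaint-handling system"
    DISPUTE_BODY = "independent out-of-court dispute settlement body"

def next_stage(stage: RedressStage) -> Optional[RedressStage]:
    """Return the next escalation step, or None at the end of the path.
    A complaint to the Digital Services Coordinator can be filed in
    parallel at any point (not modelled here)."""
    order = list(RedressStage)
    i = order.index(stage)
    return order[i + 1] if i + 1 < len(order) else None
```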
Frequently Asked Questions about Content Moderation
What is content moderation under the DSA?
Content moderation covers all the activities a platform performs to identify, assess and act on content from users. This includes removal of illegal content, restriction of visibility, suspension of accounts and enforcement of the platform's own terms.
Must platforms manually review all content?
No. The DSA prohibits general monitoring obligations. Platforms may use automated tools to identify potentially illegal content, but there must be human oversight of automated decisions, and affected users must have access to complaint mechanisms.
What happens if a platform removes my content?
Under the DSA, the platform must give you a statement of reasons for the removal. You have the right to complain via the platform's internal complaint-handling system. If you are dissatisfied with the outcome, you can bring the case before an independent dispute settlement body.
Can platforms freely determine their moderation policy?
Platforms have the right to set their own terms, but the DSA requires that the terms are clear, consistently enforced, and that users have access to complaint processes. Platforms must not use dark patterns to manipulate users.
What is the difference between illegal content and content that violates terms?
Illegal content is content that violates EU or national legislation. Content that violates terms is content the platform has chosen to prohibit in its rules but which is not necessarily illegal. The DSA regulates both types, with a focus on illegal content.
Related Terms
Digital Services Act (DSA)
The Digital Services Act is the EU regulation governing digital intermediary services with requirements for content moderation, transparency and user rights.
Algorithmic Transparency
Algorithmic transparency is the requirement that digital platforms must disclose to users how their recommender systems and automated decisions function.
Illegal Content (DSA)
Illegal content under the DSA is any information that violates EU law or a member state's national law, regardless of subject matter.