Trusted Flagger
A trusted flagger is an organisation with special status under the Digital Services Act that can report illegal content with priority. Platforms are obliged to process reports from these organisations faster than ordinary notifications.
What is a trusted flagger?
The DSA introduces a scheme whereby specialised organisations can obtain trusted flagger status. These organisations have documented expertise in identifying illegal content within their field and receive a special role in the DSA system.
The scheme is built on a simple logic: a consumer organisation with expertise in product safety is better at identifying dangerous products on a marketplace than the average user. By granting these organisations priority, the DSA ensures that illegal content is identified and assessed more quickly.
Trusted flaggers are designated by the Digital Services Coordinator in each EU member state. Designation follows an application, and the organisation must fulfil a number of conditions.
Requirements for obtaining status
DSA Article 22 sets out three fundamental conditions for becoming a trusted flagger:
- Expertise: The organisation must have particular expertise and competence to detect, identify and report illegal content within its field.
- Independence: The organisation must be independent of online platforms. A department within a platform company cannot obtain the status.
- Diligence and objectivity: The organisation must submit reports with diligence, accuracy and in good faith. Reports must be well-founded and objective.
The organisation must represent collective interests and not act on behalf of individual commercial interests. This ensures the scheme is used to protect users and society, not for competitive distortion.
The coordinator publishes a list of all trusted flaggers in the country. The lists are aggregated at EU level so platforms can verify the status of flaggers from all member states.
Platform obligations
When a trusted flagger submits a notification, the platform must process it with priority. In practice, this means the platform must:
- Set up dedicated channels or processes for notifications from trusted flaggers
- Process these notifications faster than ordinary reports
- Ensure the decision on the content is taken without undue delay
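The priority handling described above can be sketched as a two-tier moderation queue. This is a minimal illustrative sketch, not anything prescribed by the DSA: the class names, the two-tier priority scheme, and the set-based flagger lookup are all assumptions for illustration.

```python
from dataclasses import dataclass, field
from queue import PriorityQueue

@dataclass(order=True)
class Notification:
    priority: int        # 0 = trusted flagger, 1 = ordinary user report
    received_at: int     # tiebreaker: earlier reports are handled first
    reporter: str = field(compare=False)
    content_id: str = field(compare=False)

class ModerationQueue:
    """Hypothetical queue that processes trusted-flagger reports first."""

    def __init__(self, trusted_flaggers: set[str]):
        self.trusted_flaggers = trusted_flaggers
        self._queue: PriorityQueue[Notification] = PriorityQueue()
        self._clock = 0

    def submit(self, reporter: str, content_id: str) -> None:
        # Trusted-flagger reports jump ahead of ordinary reports,
        # but within each tier processing stays first-in, first-out.
        priority = 0 if reporter in self.trusted_flaggers else 1
        self._queue.put(Notification(priority, self._clock, reporter, content_id))
        self._clock += 1

    def next(self) -> Notification:
        return self._queue.get()
```

In this sketch an ordinary report submitted before a trusted-flagger report is still dequeued after it, mirroring the priority obligation; the platform's own assessment of each dequeued notification is deliberately out of scope.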
The platform must still make its own assessment. A notification from a trusted flagger does not automatically lead to removal. Content moderation remains the platform's responsibility, but the trusted flagger's expertise weighs heavily in the assessment.
Platforms that systematically ignore notifications from trusted flaggers risk sanctions from the Digital Services Coordinator. Conversely, a trusted flagger may lose its status if its reports repeatedly prove inaccurate.
Trusted flaggers in practice
In practice, trusted flaggers cover many different areas. Consumer protection organisations may focus on dangerous products or misleading advertisements. Child protection organisations may focus on harmful content targeting minors. Copyright organisations may focus on pirated content.
For very large online platforms, cooperation with trusted flaggers is part of the mandatory risk mitigation. These platforms must actively engage with trusted flaggers and ensure effective processes.
The scheme creates a bridge between civil society and the platforms. Instead of platforms assessing content alone, experts with specialist knowledge of specific types of illegal content are involved. This strengthens both algorithmic transparency and the quality of content moderation across the EU.
Frequently Asked Questions about Trusted Flaggers
What is a trusted flagger under the DSA?
A trusted flagger is an organisation designated by a national Digital Services Coordinator. The organisation has demonstrated expertise in identifying illegal content, and its reports must be processed with priority by platforms.
How does one become a trusted flagger?
An organisation applies to the national Digital Services Coordinator. The applicant must document expertise and competence in the relevant content area, independence from platforms, and the capacity to submit reports with diligence and objectivity.
Must platforms always remove content reported by a trusted flagger?
No. Platforms must process reports from trusted flaggers with priority and more quickly, but they must still make their own assessment of whether the content is illegal. Automatic removal is not a requirement.
Can a trusted flagger's status be revoked?
Yes. The Digital Services Coordinator may revoke the status if the flagger no longer meets the conditions, e.g. due to repeated inaccurate reports or lack of independence.
Related Terms
Digital Services Act (DSA)
The Digital Services Act is the EU regulation governing digital intermediary services with requirements for content moderation, transparency and user rights.
Illegal Content (DSA)
Illegal content under the DSA is any information that violates EU law or a member state's national law, regardless of subject matter.
Content Moderation
Content moderation encompasses platforms' processes for identifying, assessing and acting on user-generated content under the DSA.