Very Large Online Platform (VLOP)
A very large online platform (VLOP) is an online platform with an average of at least 45 million monthly active users in the EU. Under the Digital Services Act (DSA), these platforms face the most extensive obligations, including annual risk assessments, independent audits and direct supervision by the European Commission.
What is a VLOP?
The DSA divides online platforms into categories based on their size and influence. The top of the hierarchy comprises very large online platforms (VLOPs) and very large online search engines (VLOSEs). The threshold is set at 45 million average active users in the EU per month, corresponding to approximately 10% of the EU population.
Platforms must publish their user numbers at least once every six months. On that basis, the European Commission formally designates platforms as VLOPs. The first designation list was published in April 2023 and includes Facebook, Instagram, TikTok, YouTube, X, Amazon, LinkedIn, Snapchat, Pinterest and Booking.com.
Enhanced requirements for VLOPs
Beyond the requirements applying to all online platforms and intermediary services, VLOPs have additional obligations:
- Annual risk assessments: VLOPs must assess systemic risks from their services, including risks to fundamental rights, public health and public discourse.
- Risk mitigation measures: Based on the assessment, the platform must implement reasonable, proportionate and effective measures.
- Independent audit: An annual audit by an independent organisation.
- Data access for researchers: VLOPs must give recognised researchers access to data to study systemic risks.
- Semi-annual transparency reports: VLOPs must publish transparency reports every six months, instead of the annual reports required of other platforms.
- Crisis preparedness: VLOPs must have contingency plans for crises affecting public safety.
Risk assessments and audits
The risk assessment is the core of the VLOP regime. The platform must identify and analyse systemic risks, including the dissemination of illegal content; negative effects on fundamental rights, civic discourse, electoral processes and public security; and negative effects on public health, minors and gender-based violence. The assessment must evaluate how the platform's recommender systems and content moderation practices contribute to these risks.
Commission supervision
The European Commission has direct supervisory competence over VLOPs and VLOSEs. The Commission can conduct investigations, require information, carry out inspections and impose fines of up to 6% of global annual turnover. VLOPs must pay an annual supervisory fee to the Commission.
Frequently Asked Questions about Very Large Online Platform (VLOP)
What is a very large online platform (VLOP)?
A VLOP is an online platform with at least 45 million average active users in the EU per month. This threshold corresponds to approximately 10% of the EU population. VLOPs have enhanced obligations under the DSA.
Which platforms have been designated as VLOPs?
The European Commission has designated platforms such as Facebook, Instagram, TikTok, YouTube, X (Twitter), Amazon, LinkedIn, Snapchat, Pinterest, Booking.com, Google Maps, Google Play, App Store and others. The list is updated on an ongoing basis.
What are the additional requirements for VLOPs?
VLOPs must conduct annual risk assessments, implement risk mitigation measures, undergo independent audits, give researchers data access and have crisis preparedness. They are also subject to direct supervision by the European Commission.
Who supervises VLOPs?
The European Commission has direct supervisory competence over VLOPs and very large online search engines. The Commission can conduct investigations, require changes and impose fines. National Digital Services Coordinators may assist the Commission.
How is the 45 million user threshold calculated?
The threshold is calculated as the average number of active users in the EU per month over the preceding six months. Platforms must calculate and publish their user numbers at least once every six months.
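As a rough illustration of the averaging described above (the figures below are hypothetical; the DSA prescribes a legal threshold, not any particular computation), the six-month average can be sketched like this:

```python
# Hypothetical monthly active-user counts in the EU for the
# preceding six months, in millions of users.
monthly_active_users = [44.1, 44.8, 45.3, 46.0, 45.6, 46.4]

# The DSA threshold: an average of 45 million monthly active users in the EU.
THRESHOLD_MILLIONS = 45.0

# Average over the six-month reference period.
average = sum(monthly_active_users) / len(monthly_active_users)

# A platform at or above the average threshold is a candidate for
# designation as a VLOP by the European Commission.
meets_threshold = average >= THRESHOLD_MILLIONS

print(f"Six-month average: {average:.2f} million")
print(f"Meets VLOP threshold: {meets_threshold}")
```

Note that individual months may dip below 45 million (as in the first two values here) while the six-month average still meets the threshold; it is the average, not any single month, that matters.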
Related Terms
Digital Services Act (DSA)
The Digital Services Act is the EU regulation governing digital intermediary services with requirements for content moderation, transparency and user rights.
Algorithmic Transparency
Algorithmic transparency is the requirement that digital platforms must disclose to users how their recommender systems and automated decisions function.
Content Moderation
Content moderation encompasses platforms' processes for identifying, assessing and acting on user-generated content under the DSA.
Info
.legal A/S
hello@dotlegal.com
+45 7027 0127
VAT-no: DK40888888