Illegal Content (DSA)

Under the Digital Services Act (DSA), illegal content is defined as any information that, in itself or in relation to an activity, is not in compliance with EU law or the national law of a Member State. The definition is broad and covers everything from terrorist propaganda and child sexual abuse material to product safety violations and copyright infringement.


    What is illegal content?

    The DSA defines illegal content in Article 3(h) as any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State.

    The definition is intentionally broad. It is not limited to specific types of content but covers all information that violates applicable legislation. What is illegal depends on the specific area of law: criminal law, consumer law, copyright law, product safety law, etc.

    It is important to distinguish between illegal content and harmful content. Content can be harmful (e.g. health misinformation) without being illegal. The DSA focuses on illegal content, though platforms may set their own rules for harmful content in their terms and conditions and enforce them through content moderation.

    Types of illegal content

    Illegal content under the DSA spans many areas of law:

    **Criminal illegal content:** Terrorist propaganda, child sexual abuse material (CSAM), illegal hate speech, threats, stalking and harassment.

    **Consumer and trade law illegal content:** Sale of counterfeit products, products not meeting EU safety requirements, misleading advertisements, sale of illegal goods.

    **Intellectual property illegal content:** Copyright infringements (pirated music, film, software), trademark infringements.

    **Data protection illegal content:** Sharing of personal data without a lawful legal basis, doxing.

    Platform responsibility

    Intermediary services are, as a rule, not liable for user content as long as they have no knowledge of the illegal content. This liability exemption lapses if the service obtains actual knowledge of illegal activity or content and fails to act expeditiously. At the same time, the DSA prohibits imposing general monitoring obligations on providers.

    For very large online platforms (VLOPs), enhanced requirements apply. They must assess the risk of dissemination of illegal content in their annual risk assessments and take corresponding risk mitigation measures.

    Notice and action

    The DSA establishes a standardised notice-and-action mechanism. All hosting services must provide a mechanism for users to report illegal content. Notices from trusted flaggers must be processed with priority. When a platform decides to remove content, it must inform the affected user with a statement of reasons, and the user has the right to lodge a complaint.

    Frequently Asked Questions about Illegal Content (DSA)

    What is illegal content under the DSA?

    Illegal content under the DSA is any form of information that violates EU law or a Member State's national law. The definition is broad in scope and includes terrorist propaganda, child sexual abuse material, illegal hate speech, product safety violations, copyright infringement and misleading advertisements.

    Must platforms actively search for illegal content?

    No. The DSA prohibits general monitoring obligations. Platforms are not required to actively monitor all content, but they must act expeditiously when they receive notifications or otherwise gain knowledge of illegal content.

    What is the difference between illegal content and harmful content?

    Illegal content is defined by the fact that it violates applicable legislation. Harmful content may be problematic but is not necessarily illegal. The DSA primarily regulates illegal content, whilst platforms may set their own policies for harmful content in their terms.

    What happens when a platform receives a notification about illegal content?

    The platform must assess the notification and make a decision. If the content is assessed as illegal, it must be removed or blocked expeditiously. The platform must inform the user who uploaded the content and provide access to a complaint mechanism.
