Content Moderation

Content moderation encompasses the processes a platform uses to identify, assess and act on user-generated content. Under the Digital Services Act (DSA), platforms must ensure transparent moderation processes, provide reasons for content removal and offer complaint mechanisms for users.



    What is content moderation?

    Content moderation covers all the activities a platform performs to handle user-generated content. This ranges from removing illegal content such as terrorist propaganda and child sexual abuse material to enforcing the platform's own terms regarding hate speech or misinformation.

    The DSA defines content moderation broadly. It encompasses not only removal of content but also restriction of visibility, addition of warnings or labels, suspension of user accounts and blocking of users.

    For platforms, moderation is a constant balancing act between protecting users against harmful content and respecting freedom of expression. The DSA seeks to strike this balance by placing requirements on the process, not on the specific outcome.

    DSA requirements for moderation

    The DSA imposes a range of obligations on platforms to ensure content moderation takes place transparently and fairly. These include transparent terms of service describing the moderation policy, a duty to provide reasons when removing content, notice-and-action mechanisms for reporting illegal content, and annual transparency reports. Reports from trusted flaggers must be processed with priority.
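The duty to provide reasons can be pictured as a structured notice sent to the affected user. The sketch below is illustrative only: the DSA prescribes what information a statement of reasons must contain (the facts relied on, the legal or contractual ground, whether automated means were involved, and available redress), not any particular data format, and all field and class names here are assumptions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StatementOfReasons:
    """Illustrative record of a moderation decision notice (names are assumptions)."""
    content_id: str
    decision: str                  # e.g. "removal", "visibility_restriction"
    facts: str                     # facts and circumstances relied on
    legal_ground: Optional[str]    # law allegedly violated, for illegal content
    tos_ground: Optional[str]      # terms-of-service clause, for policy violations
    automated_detection: bool      # content was detected by automated means
    automated_decision: bool       # the decision itself was taken automatically
    redress_options: list = field(default_factory=list)

# Hypothetical example notice for a removed post.
notice = StatementOfReasons(
    content_id="post-123",
    decision="removal",
    facts="Post reported as illegal hate speech on 2024-05-01.",
    legal_ground="National criminal-law provision on hate speech",
    tos_ground=None,
    automated_detection=True,
    automated_decision=False,
    redress_options=[
        "internal complaint system",
        "out-of-court dispute settlement",
        "judicial redress",
    ],
)
```

Note that either a legal ground or a terms-of-service ground (or both) can apply, which mirrors the distinction the DSA draws between illegal content and content that merely violates a platform's own rules.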

    Automated moderation and human oversight

    With billions of posts daily on major platforms, automated moderation tools are unavoidable. The DSA acknowledges this but requires that platforms disclose their use of automated tools, that users affected by automated decisions are informed, and that there is human oversight. At the same time, the DSA prohibits imposing general monitoring obligations on platforms, and these requirements tie into the DSA's broader rules on algorithmic transparency.
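One common way to combine automated tools with human oversight is confidence-based routing: only high-confidence flags are acted on automatically (with the user informed that automation was involved), while uncertain cases go to a human moderator. This is a minimal sketch under assumed thresholds and function names; the DSA requires oversight but does not mandate any specific mechanism.

```python
def route_flag(score: float, auto_threshold: float = 0.95) -> str:
    """Decide how to handle an automated flag on a piece of content.

    `score` is a hypothetical classifier confidence in [0, 1];
    the thresholds are illustrative assumptions, not DSA values.
    """
    if score >= auto_threshold:
        # High confidence: act automatically, but the affected user
        # must still be told that an automated decision was made.
        return "act_and_notify_user"
    if score >= 0.5:
        # Uncertain cases are escalated to a human moderator.
        return "human_review"
    return "no_action"

print(route_flag(0.99))  # act_and_notify_user
print(route_flag(0.70))  # human_review
```

The design choice here is that automation narrows the queue rather than replacing review: humans see the ambiguous middle band, and every automated action remains contestable through the complaint mechanisms described below.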

    Complaint access and dispute resolution

    The DSA gives users the right to complain about moderation decisions. Online platforms must establish an internal complaint system. If the user is unsatisfied, the case can be brought before an independent dispute resolution body. Users may also complain to the Digital Services Coordinator.

    Frequently Asked Questions about Content Moderation

    What is content moderation under the DSA?

    Content moderation covers all the activities a platform performs to identify, assess and act on content from users. This includes removal of illegal content, restriction of visibility, suspension of accounts and enforcement of the platform's own terms.

    Must platforms manually review all content?

    No. The DSA prohibits general monitoring obligations. Platforms may use automated tools to identify potentially illegal content, but there must be human oversight of automated decisions, and users must have access to complaint mechanisms.

    What happens if a platform removes my content?

    Under the DSA, the platform must give you a reason for the removal. You have the right to complain via the platform's internal complaint system. If you are unsatisfied, you can bring the case before an independent dispute resolution body.

    Can platforms freely determine their moderation policy?

    Platforms have the right to set their own terms, but the DSA requires that the terms are clear, consistently enforced, and that users have access to complaint processes. Platforms must not use dark patterns to manipulate users.

    What is the difference between illegal content and content that violates terms?

    Illegal content is content that violates EU or national legislation. Content that violates terms is content the platform has chosen to prohibit in its rules but which is not necessarily illegal. The DSA regulates both types, with a focus on illegal content.
