Algorithmic Transparency

Algorithmic transparency is the requirement that digital platforms make visible to users how automated systems sort, prioritise and recommend content. Under the Digital Services Act (DSA), platforms must disclose the main parameters of their recommender systems, and very large platforms must additionally offer at least one option that is not based on profiling.

    What is algorithmic transparency?

    When you scroll through your social media feed, view search results on an online marketplace or receive recommendations on a streaming platform, it is algorithms that determine what you see. Algorithmic transparency is about making these invisible decision-making processes visible.

    The concept covers platforms' obligation to inform users about which parameters determine what they see. This may include your location, previous clicks, age or other data points. The point is that as a user you must be able to understand the logic behind the content presented to you.

    The transparency requirement goes beyond merely disclosing that algorithms are used. Platforms must explain the key parameters in language that is comprehensible to ordinary users. This places demands on how organisations communicate about technically complex systems.
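One way to meet the "comprehensible to ordinary users" demand is to keep parameter disclosures as structured data paired with plain-language explanations, so the same source feeds both the terms of service and the user interface. The sketch below is purely illustrative; the parameter names and wording are assumptions, not DSA-mandated fields.

```python
# Hypothetical machine-readable description of a recommender's
# key parameters, each paired with a plain-language explanation.
KEY_PARAMETERS = [
    {"name": "recency",
     "explanation": "Newer posts are shown higher in your feed."},
    {"name": "engagement_history",
     "explanation": "Posts similar to ones you clicked before are boosted."},
    {"name": "location",
     "explanation": "Content relevant to your region may be prioritised."},
]

def render_disclosure(parameters):
    """Render a user-facing disclosure in plain language."""
    lines = ["Why you see this content:"]
    for p in parameters:
        lines.append(f"- {p['name']}: {p['explanation']}")
    return "\n".join(lines)

print(render_disclosure(KEY_PARAMETERS))
```

Keeping explanations next to the parameters they describe makes it harder for the legal text and the user-facing text to drift apart.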

    DSA requirements for recommender systems

    DSA Article 27 sets out specific requirements for platforms that use recommender systems. All online platforms must describe the key parameters their recommender systems use in their terms of service. This also includes disclosing options for changing or influencing these parameters.

    The requirements encompass three central elements:

    • Disclose parameters: Platforms must describe the most important criteria used to rank and recommend content. This applies to both parameters derived directly from user behaviour and parameters set by the platform itself.
    • Offer alternatives: Where several options for the recommender system are available, users must be able to select and modify their preferred option at any time. For very large platforms, DSA Article 38 additionally requires at least one option that is not based on profiling, such as a chronological feed.
    • Clear language: The information must be presented in clear and understandable language, not buried in lengthy legal documents.

    The requirement applies to all online platforms that use recommender systems, although micro and small enterprises are exempt from most of the platform-specific obligations. The practical obligations are most extensive for very large platforms.
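The two recommender modes described above, a profiled ranking and a non-profiling alternative, can be sketched in a few lines. This is a minimal illustration under assumed data structures (`Item`, `interests`), not a description of any real platform's system.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    posted_at: int                      # Unix timestamp
    topics: set = field(default_factory=set)

def rank(items, interests=None, use_profiling=True):
    """Rank feed items.

    With profiling enabled, items overlapping the user's inferred
    interests are boosted; otherwise a reverse-chronological order
    is used (the non-profiling alternative)."""
    if not use_profiling or interests is None:
        # Non-profiling option: plain reverse-chronological feed.
        return sorted(items, key=lambda i: i.posted_at, reverse=True)
    # Profiled option: topic overlap first, recency as tie-breaker.
    return sorted(
        items,
        key=lambda i: (len(i.topics & interests), i.posted_at),
        reverse=True,
    )

feed = [
    Item("a", posted_at=100, topics={"sport"}),
    Item("b", posted_at=200, topics={"news"}),
    Item("c", posted_at=50, topics={"sport", "news"}),
]

profiled = rank(feed, interests={"sport"})       # interest match first
chronological = rank(feed, use_profiling=False)  # newest first
```

The point of the toggle is that the non-profiling path ignores the user's inferred interests entirely, rather than merely weighting them less.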

    Enhanced requirements for very large platforms

    Very large online platforms (VLOPs) with over 45 million average monthly active users in the EU have enhanced obligations. They must conduct annual risk assessments of how their recommender systems may affect fundamental rights, public health and public discourse.

    These platforms must also undergo independent audits at least once a year. The auditor assesses whether the platform's recommender systems function as described and whether the risk mitigation measures are adequate.

    The Digital Services Coordinator in the country where the platform is established supervises compliance. The European Commission has direct supervisory competence over very large platforms.

    Risk assessments must specifically address whether algorithms amplify illegal content, misinformation or manipulation. This requires ongoing analysis of the systems' actual effects, not merely a theoretical review.

    Interaction with GDPR and personal data

    Algorithmic transparency under the DSA supplements the rights you already have under the GDPR. GDPR Article 22 gives you the right not to be subject to a decision based solely on automated processing where that decision has legal or similarly significant effects. GDPR Articles 13-14 require disclosure of the existence of automated decision-making, including meaningful information about the logic involved.

    The DSA adds a new layer by requiring proactive disclosure to all users, not just those who specifically request it. Where the GDPR focuses on the protection of personal data, the DSA focuses on transparency in how the system operates.

    For organisations operating as data controllers, this means ensuring compliance with both regulatory frameworks: a lawful basis for profiling under the GDPR and transparency about recommender systems under the DSA must be addressed together.

    The technical and organisational measures that support GDPR compliance can often be reused in the DSA context. Good documentation of data flows and decision-making processes benefits both regulatory frameworks.

    Frequently Asked Questions about Algorithmic Transparency

    What is algorithmic transparency under the DSA?

    Algorithmic transparency is the requirement in the Digital Services Act that platforms disclose to users the key parameters of their recommender systems. Very large platforms must additionally offer alternatives that are not based on profiling.

    Which platforms must comply with the algorithmic transparency requirements?

    All online platforms that use recommender systems must comply, although micro and small enterprises are exempt from most of the platform-specific obligations. Very large online platforms (VLOPs) with over 45 million average monthly active users in the EU have enhanced obligations, including independent auditing of their systems.

    How does algorithmic transparency relate to GDPR?

    GDPR already grants the right to information about automated decisions. DSA extends this by requiring platforms to proactively inform all users about recommender system parameters, not just those who request it.

    Can users opt out of algorithm-based recommendations?

    On very large platforms, yes. DSA Article 38 requires VLOPs to offer at least one recommender option that is not based on profiling of the user, for example a chronological display instead of a personalised feed.
