Provider (AI Act)

A provider is the organisation or person that develops an AI system or a general-purpose AI model for the purpose of placing it on the market under its own name. The provider bears the primary responsibility for ensuring the system complies with the AI Act.


    What is a provider?

    A provider in the AI Act’s sense is a natural or legal person that develops an AI system or a general-purpose AI model, or has an AI system developed, for the purpose of placing it on the market or putting it into service under its own name or trademark. This applies regardless of whether it is done for payment or free of charge.

    What matters is not who wrote the code. If you commission an AI system from a consultant and launch it under your company name, you are the provider. If you purchase a general-purpose AI model and build a product on top of it that you sell onward, you are the provider of the combined AI system.

    The role is somewhat comparable to the data controller in GDPR. Just as the data controller bears overarching responsibility for the processing of personal data, the provider bears overarching responsibility for ensuring the AI system is safe and compliant.

    The provider’s obligations

    The provider’s obligations depend on whether the AI system is classified as high-risk or not. For high-risk AI systems, the requirements are the most extensive:

    • Quality management system: Establish and maintain a quality management system covering the entire AI system’s lifecycle. The system must document procedures for design, development, testing and monitoring.
    • Technical documentation: Prepare and update technical documentation enabling authorities to assess the system’s compliance with the requirements.
    • Conformity assessment: Carry out a formal conformity assessment verifying that the system meets all relevant requirements before it is placed on the market.
    • Registration: Register the system in the EU database for high-risk AI systems.
    • Logging: Ensure automatic logging of the system’s operation so that decisions can be traced.
    • Human oversight: Design the system so that humans can monitor and override it. See human oversight of AI.
    • Post-market monitoring: Monitor the system’s performance after it has been placed on the market and respond to new risks.
    • Incident reporting: Report serious incidents to the supervisory authorities without undue delay, and no later than 15 days after becoming aware of the incident.

    For AI systems with limited risk (e.g. chatbots), the provider’s main obligation is to ensure that users are informed they are interacting with AI and that synthetic content is correctly labelled.
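    The tiered obligations above can be sketched as a simple lookup. This is purely an illustrative summary of the lists in this article; the tier keys and obligation strings are paraphrases, not legal definitions:

    ```python
    # Illustrative sketch only: maps the risk tiers described above to the
    # provider obligations this article lists. Not a legal classification tool.
    PROVIDER_OBLIGATIONS = {
        "high-risk": [
            "quality management system",
            "technical documentation",
            "conformity assessment",
            "EU database registration",
            "automatic logging",
            "human oversight by design",
            "post-market monitoring",
            "serious incident reporting",
        ],
        "limited-risk": [
            "inform users they are interacting with AI",
            "label synthetic content",
        ],
    }

    def obligations_for(tier: str) -> list[str]:
        """Return the obligations this article lists for a given risk tier."""
        return PROVIDER_OBLIGATIONS.get(tier, [])
    ```

    A tier not covered by the article (for example minimal risk) simply returns an empty list here; in practice, minimal-risk systems carry no provider obligations beyond general law.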

    Other roles in the AI Act

    The AI Act defines several roles beyond the provider:

    • Deployer: The organisation that uses an AI system under its own authority. The deployer is responsible for using the system correctly, ensuring human oversight and reporting faults to the provider. It is not the end user (the person who types into the system), but the organisation that has decided to deploy the system.
    • Importer: The party that places an AI system from a third country on the EU market. The importer must ensure that the provider has carried out the necessary conformity assessment.
    • Distributor: The party that makes an AI system available on the EU market without modifying it. The distributor must verify that the system has CE marking and is correctly registered.
    • Authorised representative: A person in the EU designated by a provider outside the EU to carry out provider obligations in the EU.

    The role distribution ensures that there is always a responsible party in the EU. If a provider from a third country does not have an authorised representative in the EU, the importer is liable for the provider’s obligations.

    When you change roles

    A central aspect of the AI Act is that you can change roles. A deployer can become a provider if they:

    • Put their own name or trademark on an AI system that has already been placed on the market.
    • Make a substantial modification to a high-risk AI system.
    • Change the system’s intended purpose so that it becomes high-risk.

    This means in practice that if you purchase an AI system and adapt it significantly for your own use, you assume the provider’s full responsibilities. You must carry out a new conformity assessment and fulfil all provider obligations.
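    The three reclassification triggers above amount to a simple rule: any one of them is enough. A minimal sketch, where the parameter names are assumptions chosen for this example rather than AI Act terminology:

    ```python
    # Illustrative sketch of the three reclassification triggers described above.
    # Any single trigger is sufficient for a deployer to become a provider.
    def becomes_provider(rebrands_system: bool,
                         substantially_modifies_high_risk: bool,
                         changes_purpose_to_high_risk: bool) -> bool:
        """True if a deployer assumes provider obligations under any trigger."""
        return (rebrands_system
                or substantially_modifies_high_risk
                or changes_purpose_to_high_risk)
    ```

    For example, an organisation that substantially modifies a purchased high-risk system, without rebranding it or changing its purpose, still becomes a provider: becomes_provider(False, True, False) is True.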

    This is one of the most underestimated consequences of the AI Act. Many organisations fine-tune AI models or build new products on top of existing systems without considering that they may thereby become providers. Map your AI systems, identify your role for each system, and ensure you comply with the obligations that follow from that role.

    The comparison with GDPR is relevant: just as an organisation can shift from data processor to data controller if it begins to determine the purposes and means of data processing, an AI deployer can become a provider if it takes ownership of the system.

    Frequently Asked Questions about Provider (AI Act)

    Who is a provider in the AI Act?

    A provider is the person or organisation that develops an AI system or has it developed for the purpose of placing it on the market or putting it into service under its own name. This applies to organisations both inside and outside the EU if the system is used in the EU.

    What is the difference between a provider and a deployer in the AI Act?

    The provider develops or markets the AI system and has the primary responsibility for compliance. The deployer uses the system and has responsibility for using it correctly, monitoring its operation and ensuring human oversight.

    Can I become a provider without knowing it?

    Yes. If you change an AI system’s purpose, put your name on it or make substantial modifications, you may be regarded as a provider with the associated obligations, even if you did not develop the system yourself.

    What are the provider’s most important obligations?

    For high-risk AI systems, the provider must establish a quality management system, prepare technical documentation, carry out a conformity assessment, ensure human oversight and monitor the system after it has been placed on the market.
