General-Purpose AI Model (GPAI)

A general-purpose AI model is an AI model trained on large datasets that can perform many different tasks. The AI Act imposes specific requirements on providers of these models from 2 August 2025.

    What is a general-purpose AI model?

    A general-purpose AI model (GPAI) is an AI model trained on large amounts of data that can be used for many different tasks. Think of models such as GPT-4, Llama, Gemini and Claude. They can generate text, write code, analyse images and solve complex tasks they were not specifically trained for.

    The central characteristic is versatility. Whereas a specific AI system is built for one task (e.g. detecting cancer cells in scans), a general-purpose AI model can be adapted for thousands of different uses. This makes them extremely useful but also difficult to regulate, because the provider cannot foresee all possible use cases.

    The AI Act distinguishes between the model itself (e.g. GPT-4) and the AI system built on the model (e.g. ChatGPT). A provider of a general-purpose AI model has its own obligations, independent of what downstream providers use the model for.

    Requirements for providers of general-purpose AI models

    From 2 August 2025, all providers of general-purpose AI models must meet a set of baseline requirements. These apply regardless of the model’s size or capacity.

    • Technical documentation: The provider must prepare and maintain technical documentation about the model, including the training process and evaluation results. The documentation must be made available to supervisory authorities and downstream providers.
    • Information to downstream providers: When a general-purpose AI model is integrated into an AI system by another provider, the model provider must supply sufficient information for the downstream provider to fulfil its own obligations.
    • Copyright policy: The provider must have a policy for compliance with EU copyright law, including text and data mining rules.
    • Training data summary: The provider must prepare a sufficiently detailed summary of the training data.

    Open-source models face lighter requirements. If the model is released under a free and open-source licence and its parameters, including the weights, are made publicly available, the provider only needs to comply with the copyright policy and training data summary requirements. This exemption does not apply, however, to models with systemic risk.

    Models with systemic risk

    General-purpose AI models with systemic risk are subject to enhanced requirements. A model is presumed to have systemic risk if the cumulative compute used for its training exceeds 10^25 floating-point operations (FLOPs), or if the European Commission designates it as such on the basis of other criteria, such as the number of users, the scope of its output or its market impact.
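    Whether a training run crosses the 10^25 FLOPs presumption threshold can be sanity-checked with the widely used back-of-the-envelope estimate of roughly 6 × parameters × training tokens for dense transformer training. This is a rough sketch: the 6ND approximation, the function names and the example figures below are illustrative assumptions, not part of the AI Act.

```python
# Rough check of whether a training run crosses the AI Act's 10^25 FLOPs
# presumption threshold for systemic risk (Article 51). Uses the common
# approximation that training compute is about 6 x parameters x tokens
# (assumption: dense transformer, single pass over the data).

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25


def estimated_training_flops(n_parameters: float, n_tokens: float) -> float:
    """Approximate total training compute in floating-point operations."""
    return 6 * n_parameters * n_tokens


def presumed_systemic_risk(n_parameters: float, n_tokens: float) -> bool:
    """True if the compute estimate exceeds the presumption threshold."""
    return estimated_training_flops(n_parameters, n_tokens) > SYSTEMIC_RISK_THRESHOLD_FLOPS


# Illustrative example: a 70-billion-parameter model on 15 trillion tokens
flops = estimated_training_flops(70e9, 15e12)  # roughly 6.3e24 FLOPs
print(f"{flops:.2e}", presumed_systemic_risk(70e9, 15e12))
```

    Note that the threshold is only a presumption: a model below it can still be designated as having systemic risk by the Commission, and the threshold itself can be updated by delegated act.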

    The additional requirements include:

    • Model evaluation: The provider must conduct standardised evaluations of the model’s capabilities, including testing for risks related to cybersecurity, biological threats and other systemic risks.
    • Risk assessment and mitigation: A systemic risk assessment must be prepared together with a plan to mitigate those risks. This is similar to the risk assessment you know from ISMS work.
    • Incident reporting: Serious incidents must be reported to the EU’s AI Office and relevant national authorities.
    • Cybersecurity: The provider must ensure an adequate level of cybersecurity protection for the model and its physical infrastructure, including access control and encryption.

    The EU’s AI Office has direct supervisory authority over general-purpose AI models with systemic risk. This is one of the few situations in the AI Act where supervision occurs at EU level rather than national level.

    What does this mean in practice?

    If your organisation uses a general-purpose AI model (e.g. via an API from OpenAI or Google), you are typically not the provider of the model. You are either a deployer of an AI system or a provider of your own AI system built on the model.

    Your responsibility depends on how you use the model:

    • You use the model directly: You are responsible for using it in compliance with the AI Act and ensuring sufficient AI literacy among employees.
    • You build a product on top of the model: You become a provider of an AI system and are responsible for meeting the requirements for the risk level your system falls into.

    You should ensure that your model provider meets its obligations. Request technical documentation and information about the model’s limitations. This becomes part of your own compliance under the AI Act, just as you must ensure under GDPR that your data processors meet the requirements.

    Frequently Asked Questions about General-Purpose AI Model (GPAI)

    What is a general-purpose AI model?

    A general-purpose AI model (GPAI) is an AI model trained on large datasets that can perform many different tasks. Examples include GPT-4, Llama and Gemini. They can be used for text, code, images and much more.

    When do the rules for general-purpose AI models apply?

    The rules for general-purpose AI models apply from 2 August 2025. Providers of models placed on the market before that date have until 2 August 2027 to bring them into compliance.

    What is the difference between a general-purpose AI model and an AI system?

    A general-purpose AI model is the underlying model itself (e.g. GPT-4), whilst an AI system is the finished product that uses the model (e.g. ChatGPT). The AI Act regulates both, but with different requirements.

    What is a general-purpose AI model with systemic risk?

    A general-purpose AI model with systemic risk is a model with particularly high-impact capabilities, typically one trained with more than 10^25 floating-point operations (FLOPs) of compute. These models face enhanced requirements for safety testing and risk management.
