Conformity Assessment (AI)
A conformity assessment is the formal process by which a provider documents that a high-risk AI system meets all requirements of the AI Act. Without a valid assessment, the system may not be placed on the market or used in the EU.
What is a conformity assessment?
A conformity assessment is the process by which the provider of a high-risk AI system systematically verifies and documents that the system meets all relevant requirements of the AI Act.
The concept is well known from product safety legislation. Just as a manufacturer of medical devices must demonstrate that a product is safe before it may be sold, a provider of high-risk AI must demonstrate that the system is compliant before it may be used in the EU.
The conformity assessment is not a one-off event. It must be carried out before the system is placed on the market and repeated if the system is substantially modified. Think of it as an ongoing compliance obligation, similar to the documentation you know from ISMS under ISO 27001.
The procedure step by step
A conformity assessment for a high-risk AI system follows a structured process:
1. Quality management system: The provider must have a quality management system covering the AI system's entire lifecycle. It must document procedures for design, development, testing, monitoring and maintenance.
2. Technical documentation: All technical documentation must be prepared in accordance with Annex IV of the AI Act. This includes the system's purpose, architecture, training data, test results, performance and limitations.
3. Verification of requirements: The provider systematically verifies that each individual requirement of the AI Act is met. This includes risk assessment, data governance, transparency, human oversight, accuracy and cybersecurity.
4. Testing and validation: The system must be thoroughly tested to verify that it functions as documented. Testing must cover both normal operating conditions and edge cases.
5. EU declaration of conformity: The provider issues a formal declaration that the system meets the requirements. The declaration references the technical documentation and test results.
6. CE marking: The CE marking is affixed to the system, indicating that it has been assessed and found compliant.
7. Registration: The system is registered in the EU database for high-risk AI systems before it is placed on the market.
Internal vs. external assessment
For most high-risk AI systems, the provider may carry out the conformity assessment itself. This is called an internal assessment and is comparable to the self-assessment many know from GDPR’s data protection impact assessments (DPIA).
There is an important exception: AI systems for "real-time" and "post" remote biometric identification of persons require an external assessment carried out by a notified body (an independent third party). This is due to the particular risk that biometric identification poses to citizens' rights.
If a high-risk AI system is also regulated by other EU product legislation (e.g. as a medical device), the conformity assessment for the AI requirements must be integrated into the existing assessment procedure for the product. You do not need to carry out two separate assessments.
Regardless of whether the assessment is internal or external, all documentation must be retained for at least ten years and made available to supervisory authorities on request.
After the assessment
The conformity assessment is not the end point. The provider has an ongoing obligation to monitor the system after it has been placed on the market (post-market monitoring). This involves:
- Systematic collection and analysis of data on the system’s performance in operation.
- Assessment of whether new risks have arisen since the assessment was carried out.
- Reporting of serious incidents to the supervisory authorities.
- A new conformity assessment if the system is substantially modified.
A substantial modification is any change that goes beyond what the provider anticipated in the technical documentation. This may include changes to the training data, the system's purpose or the conditions under which it is used. The threshold is similar to the one you know from GDPR, where substantial changes in personal data processing require a new impact assessment.
Start preparations early. A thorough conformity assessment takes time, and the requirements for technical documentation are extensive. Organisations that already work in a structured manner with technical and organisational measures have a good starting point.
Frequently Asked Questions about Conformity Assessment (AI)
What is a conformity assessment in the AI Act?
It is the formal process by which the provider of a high-risk AI system documents that the system meets the AI Act’s requirements for safety, transparency, documentation and human oversight before it can be placed on the market in the EU.
Who must carry out a conformity assessment?
The provider of a high-risk AI system is responsible for carrying out the conformity assessment. For most systems, the provider may conduct the assessment itself (internal assessment). For certain biometric systems, an independent third party is required.
When must the conformity assessment be carried out?
The assessment must be carried out before the system is placed on the market or put into use. It must also be updated if the system is substantially modified. The requirement applies from August 2026 for most high-risk AI systems.
What happens if a conformity assessment is not carried out?
Placing on the market or using a high-risk AI system without a valid conformity assessment is a breach of the AI Act. This can result in fines of up to EUR 15 million or 3% of global annual turnover, whichever is higher.
Related Terms
High-Risk AI System
An AI system used in critical areas that must meet strict requirements for safety, transparency and human oversight under the AI Act.
Provider (AI Act)
The party that develops or markets an AI system under its own name, bearing primary responsibility for compliance with the AI Act.
AI Act
The EU's comprehensive regulation on artificial intelligence, classifying AI systems by risk level and imposing requirements from development to deployment.