Conformity Assessment
The process by which a high-risk AI system is evaluated against regulatory requirements before being placed on the market. Under the EU AI Act, this may involve self-assessment by the provider (internal control) or evaluation by a notified body, an independent third-party organisation designated by a member state, depending on the system's use case.
Why It Matters
Conformity assessment is the gatekeeper between development and market access in the EU. Getting it wrong means your product cannot legally be sold or deployed in one of the world's largest markets.
Example
A company building an AI system for employee recruitment (a high-risk use case under the EU AI Act) must complete a conformity assessment demonstrating compliance with data governance, transparency, and human oversight requirements before deploying it in any EU member state.
Think of it like...
Conformity assessment is like a vehicle safety inspection — you can build whatever you want in the garage, but it doesn't hit the road until it passes the test.
Related Terms
CE Marking (AI)
The conformity marking required for high-risk AI systems placed on the EU market, indicating that the system has undergone the required conformity assessment and meets all applicable EU AI Act requirements. The CE mark must be visible, legible, and affixed before the system is made available.
EU AI Act
The European Union's comprehensive regulatory framework for artificial intelligence, establishing risk-based rules. It categorizes AI systems into four tiers, from minimal to unacceptable risk, with corresponding compliance requirements.
EU Declaration of Conformity
A written declaration by the provider of a high-risk AI system stating that the system meets all applicable EU AI Act requirements. The declaration must be kept up to date, retained for at least 10 years after the system is placed on the market, and made available to national authorities on request.