with ISO/IEC 42001 & NIST AI RMF alignment
Not legal advice. Always verify against the official text and your counsel.
1) What applies when (2024–2027)
- Act entered into force: 1 Aug 2024 (OJEU publication 12 Jul 2024).
- Bans (“unacceptable risk”) + AI literacy duties: 2 Feb 2025.
- GPAI model obligations & governance rules: 2 Aug 2025 (with Commission guidance + a voluntary Code of Practice). Full enforcement powers for GPAI begin 2 Aug 2026; pre-existing GPAI models must comply by 2 Aug 2027.
- Most “high-risk” system obligations: 2 Aug 2026, with embedded-product high-risk cases extended to 2 Aug 2027.
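To make the timeline operational in your compliance tracker, it can be encoded as a simple lookup. A minimal sketch using the dates above; the function name and milestone labels are illustrative, not terms from the Act:

```python
from datetime import date

# Key AI Act applicability dates (from the timeline above).
MILESTONES = {
    "prohibitions_and_ai_literacy": date(2025, 2, 2),
    "gpai_obligations": date(2025, 8, 2),
    "gpai_full_enforcement": date(2026, 8, 2),
    "most_high_risk_obligations": date(2026, 8, 2),
    "embedded_high_risk_and_legacy_gpai": date(2027, 8, 2),
}

def obligations_in_force(today: date) -> list[str]:
    """Return the milestone labels whose start date has passed."""
    return sorted(k for k, d in MILESTONES.items() if today >= d)
```

Feeding in today's date tells you which obligation tranches are already live for planning purposes.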
2) First decision: who are you under the Act?
- Provider: puts an AI system on the EU market or puts it into service under their name/brand.
- Deployer: uses an AI system under its authority (e.g., an employer using a hiring tool).
- Importer/Distributor: places systems from outside the EU on the market / makes them available.
These roles determine your duties throughout the Act. (See Regulation (EU) 2024/1689.)
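For the inventory work in §10, it helps to tag each system with your role up front, since one organization can hold different roles for different systems. A minimal sketch; the dataclass and field names are our own, not from the Act:

```python
from dataclasses import dataclass
from enum import Enum

class Role(Enum):
    PROVIDER = "provider"        # places on EU market / into service under own name or brand
    DEPLOYER = "deployer"        # uses the system under its own authority
    IMPORTER = "importer"        # places a non-EU provider's system on the EU market
    DISTRIBUTOR = "distributor"  # makes a system available downstream

@dataclass
class AISystemRecord:
    name: str
    role: Role           # your role for THIS system; one org can hold several roles
    annex_iii_use: bool  # flagged for high-risk screening (see §3)

# Example inventory entries (hypothetical systems).
inventory = [
    AISystemRecord("cv-screening-tool", Role.DEPLOYER, annex_iii_use=True),
    AISystemRecord("our-chatbot", Role.PROVIDER, annex_iii_use=False),
]
```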
3) Are you “high-risk”?
Two main paths to high-risk status:
- AI that is a safety component of regulated products (Annex I sectors), or
- AI used in Annex III use cases (e.g., biometrics, critical infrastructure, education testing, employment screening, essential services, law enforcement, migration, justice). Some Annex III uses can be de-scoped if they present no significant risk per Article 6(3), but you must document that assessment.
Hidden gotcha: If you conclude your system is not high-risk under Article 6(3), you still have a registration duty (see §6).
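The two paths plus the Article 6(3) carve-out form a small decision tree. A crude triage sketch of the logic described above (the function name and outcome strings are ours; edge cases need counsel):

```python
def high_risk_screening(is_annex_i_safety_component: bool,
                        is_annex_iii_use_case: bool,
                        art_6_3_no_significant_risk: bool) -> str:
    """Rough Article 6 triage mirroring the two paths above; not legal advice."""
    if is_annex_i_safety_component:
        return "high-risk (Annex I safety component)"
    if is_annex_iii_use_case:
        if art_6_3_no_significant_risk:
            # De-scoped, but the assessment must be documented AND the
            # system registered (Art 49(2)) -- the "hidden gotcha" above.
            return "not high-risk per Art 6(3): document assessment + register"
        return "high-risk (Annex III use case)"
    return "not high-risk"
```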
4) If you’re high-risk: your Quality Management System (QMS)
Article 17 requires providers of high-risk AI to implement a documented QMS that covers (non-exhaustively): compliance strategy & conformity assessment, data governance, risk management, technical documentation, record-keeping & logs, human oversight, accuracy/robustness/cybersecurity, and post-market monitoring. This underpins CE marking after conformity assessment.
Provider essentials (high-risk):
- Risk management lifecycle (Article 9).
- Data & data governance (Article 10).
- Technical documentation (Article 11).
- Logging / record-keeping (Article 12).
- Transparency to deployers + human oversight (Articles 13–14).
- Accuracy, robustness, cybersecurity (Article 15).
- Post-market monitoring & serious-incident reporting (Articles 72–73).
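The provider essentials above can be tracked as a simple control register inside your QMS. A sketch with the article mapping taken from the list above; the status convention is our own:

```python
# Article -> control topic, from the provider essentials list above.
QMS_CONTROLS = {
    "Art 9":     "risk management lifecycle",
    "Art 10":    "data & data governance",
    "Art 11":    "technical documentation",
    "Art 12":    "logging / record-keeping",
    "Art 13-14": "transparency to deployers + human oversight",
    "Art 15":    "accuracy, robustness, cybersecurity",
    "Art 72-73": "post-market monitoring & serious-incident reporting",
}

def open_gaps(status: dict[str, bool]) -> list[str]:
    """List controls not yet evidenced (status maps article -> done?)."""
    return [art for art in QMS_CONTROLS if not status.get(art, False)]
```

Running `open_gaps` against your evidence log gives a quick view of which articles still lack documented controls.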
5) If you’re not high-risk: what still applies?
- General transparency duties (e.g., disclose when users interact with AI; label synthetic media/deepfakes; take special care with emotion recognition/biometrics, which are subject to bans and limits).
- GPAI (foundation-model) obligations if you provide a model, even if not a “system”. See §7.
6) EU Database registration (often missed)
Before placing Annex III high-risk systems on the market (with some exceptions), providers must register in the EU database (Article 71).
And if you classify a system as non-high-risk under Article 6(3), you must register that too (Article 49(2)).
7) GPAI (General-Purpose AI) obligations (2025–2027)
From 2 Aug 2025, GPAI model providers face transparency & safety duties (e.g., technical documentation, training-data summaries, copyright safeguards; additional testing/incident reporting for systemic-risk models). The GPAI Code of Practice (published 10 Jul 2025) and Commission guidelines (18 Jul 2025) are key implementation aids; adherence to the Code can help demonstrate compliance ahead of harmonized standards. Full enforcement for GPAI begins 2 Aug 2026; legacy models must comply by 2 Aug 2027.
8) Using ISO/IEC 42001 to structure your program
ISO/IEC 42001:2023 is the first AI management system standard (AIMS). Adopting it gives SMEs a repeatable governance backbone (policy→risk→controls→monitoring→improvement) aligned with ethical, transparency, and risk objectives. Note: it does not automatically create a presumption of conformity under the AI Act unless adopted as a harmonized European standard (EN) and cited in the OJEU. CEN/CENELEC’s JTC 21 is actively developing AI Act-supporting standards and considering adoptions.
Practical takeaway: Implement ISO/IEC 42001 processes now; once AI-Act harmonized standards land, you’ll only need to bridge the gaps.
9) NIST AI RMF (GOV–MAP–MEASURE–MANAGE): quick cross-walk
The NIST AI RMF 1.0 is voluntary but widely used. NIST also curates crosswalks to other frameworks; use these to justify your control mapping.
Indicative mapping (helpful for auditors):
- GOV → AI Act QMS (Art 17), roles & accountability, policies, training, supplier governance.
- MAP → Risk management (Art 9), data governance (Art 10), intended purpose & context definition (impacts high-risk scoping under Art 6).
- MEASURE → Accuracy/robustness/cybersecurity (Art 15) metrics, bias/performance evaluation, red-teaming for GPAI/systemic-risk models.
- MANAGE → Post-market monitoring & incident reporting (Arts 72–73), change management, CAPA, periodic re-assessment.
(Use this as an internal crosswalk, then cite your sources of truth.)
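The indicative mapping above can live as a machine-readable crosswalk in your control repository, so each NIST function links to the AI Act articles you cite as sources of truth. A sketch (entries condensed from the bullets above; labels are ours):

```python
# Indicative NIST AI RMF -> EU AI Act crosswalk, from the mapping above.
CROSSWALK = {
    "GOVERN":  ["Art 17 (QMS)", "roles & accountability", "supplier governance"],
    "MAP":     ["Art 9 (risk mgmt)", "Art 10 (data governance)", "Art 6 (high-risk scoping)"],
    "MEASURE": ["Art 15 (accuracy/robustness/cybersecurity)", "bias/performance evaluation"],
    "MANAGE":  ["Arts 72-73 (post-market monitoring, incidents)", "change management & CAPA"],
}

def citations_for(nist_function: str) -> list[str]:
    """Return the AI Act anchors mapped to a NIST AI RMF function."""
    return CROSSWALK.get(nist_function.upper(), [])
```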
10) A 90-day action plan for SMEs
Days 1–15 — Scoping & roles
- Inventory AI systems and models (in-house and vendor).
- Decide: provider vs deployer vs distributor/importer for each item.
- Flag anything that hits Annex III or acts as a safety component.
Days 16–30 — Risk & governance
- Stand up an AI QMS skeleton (policy set, RACI, training, supplier terms, model registry).
- Start risk management records per Article 9 (templates + approvals).
Days 31–60 — Documentation & controls
- Produce technical documentation packs (intended purpose, data lineage, testing, metrics).
- Implement human oversight and user transparency measures.
- Draft post-market monitoring and serious-incident SOPs.
Days 61–90 — Registration & readiness
- Register eligible high-risk systems and any Art 6(3) non-high-risk decisions in the EU database (where required).
- For GPAI providers: align to the GPAI Code of Practice + Commission guidelines (doc sets, testing plans, copyright measures); plan for the 2026 enforcement uplift.
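The four phases above can be turned into concrete calendar deadlines once you pick a kickoff date. A small sketch; the phase boundaries come from the plan above, the helper name is ours:

```python
from datetime import date, timedelta

# The four phases of the 90-day plan above, as (start_day, end_day, label).
PHASES = [
    (1, 15, "Scoping & roles"),
    (16, 30, "Risk & governance"),
    (31, 60, "Documentation & controls"),
    (61, 90, "Registration & readiness"),
]

def phase_deadlines(kickoff: date) -> list[tuple[str, date]]:
    """Turn the relative plan into concrete deadlines from a kickoff date.

    Day 1 is the kickoff date itself, so day N falls N-1 days later.
    """
    return [(label, kickoff + timedelta(days=end - 1)) for _, end, label in PHASES]
```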
11) Compliance checklists you can reuse
Provider (high-risk) mini-checklist