EU AI Act Article 72 — Post-Market Monitoring
Article 72 is the EU AI Act's "you don't get to ship and forget" article. Every provider of a high-risk AI system must establish a post-market monitoring (PMM) system that feeds back into Article 9 risk management and Article 73 incident reporting. Without it, you can't even comply with Article 73's 15-day incident-report deadline.
What Article 72 requires
- 72(1) — Providers shall establish + document a post-market monitoring system proportionate to the nature of the AI technologies + risks of the high-risk system.
- 72(2) — PMM system shall actively + systematically collect, document, and analyse relevant data provided by deployers + collected through other sources on performance throughout system lifetime.
- 72(3) — PMM shall be based on a documented post-market monitoring plan. The Commission is expected to adopt an implementing act with a template plan (Article 72(3); expected Q3 2026).
How MEOK covers Article 72
- /transparency (£399-£1,499/mo) — continuous decision-trace logging is the PMM data layer (every decision = one data point with full context + signed cert).
- meok-attestation-verify — cryptographic verification of PMM checkpoints. Auditor curls verify URL, gets signed payload.
- meok-governance-engine-mcp — generates an Article 72 PMM plan template aligned with the Commission's expected Q3 2026 implementing act.
- /audit-prep-bundle (£4,950) — 14-day signed evidence pack covering Article 72 PMM + Article 9 RMS + Article 73 incident reporting flow.
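The auditor-side verification flow above can be sketched as follows. This is a hedged illustration, not MEOK's actual attestation format: the payload fields are invented, and a shared-key HMAC stands in for whatever signature scheme meok-attestation-verify really uses (a production system would use asymmetric keys).

```python
# Illustrative sketch: signing and verifying a PMM checkpoint payload.
# Field names and the HMAC scheme are assumptions, not MEOK's real API.
import hashlib
import hmac
import json

def sign_checkpoint(payload: dict, key: bytes) -> str:
    """Provider side: sign the canonical JSON form of a PMM checkpoint."""
    canonical = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, canonical, hashlib.sha256).hexdigest()

def verify_checkpoint(payload: dict, signature: str, key: bytes) -> bool:
    """Auditor side: recompute the signature and compare in constant time."""
    expected = sign_checkpoint(payload, key)
    return hmac.compare_digest(expected, signature)

key = b"shared-attestation-key"  # placeholder key for the sketch
checkpoint = {
    "article": "72",
    "period": "2026-Q3",
    "decisions_logged": 12842,
    "drift_alerts": 2,
}
sig = sign_checkpoint(checkpoint, key)
assert verify_checkpoint(checkpoint, sig, key)       # untampered payload verifies
checkpoint["drift_alerts"] = 0
assert not verify_checkpoint(checkpoint, sig, key)   # tampered payload fails
```

The point of the design: the auditor never needs database access, only the signed payload and the verification key, so evidence can be checked long after the fact.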
Frequently asked
What does Article 72 actually require?
Providers of high-risk AI systems must establish a post-market monitoring system to collect, document, and analyse relevant data on system performance throughout its lifetime, allowing the provider to evaluate continuous compliance with the requirements of Chapter III, Section 2 (Articles 8-15, the high-risk obligations bundle). The PMM system must be proportionate to the risks of the system and based on a documented PMM plan.
What goes into the PMM plan?
Article 72(3): the PMM plan describes how the provider will systematically collect and analyse data on real-world performance, identify when retraining/recalibration is needed, evaluate emerging risks, and feed back into the Article 9 risk management system. The European Commission is expected to publish a PMM plan template via implementing act (Q3 2026); providers can use it or roll their own.
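One way to see the plan's required elements concretely is to lay them out as a data structure. This is a minimal sketch only; the field names are our own invention, not the Commission's template.

```python
# Illustrative structure for an Article 72(3) PMM plan.
# Field names are assumptions, not the official Commission template.
from dataclasses import dataclass

@dataclass
class PMMPlan:
    system_name: str
    data_sources: list          # e.g. deployer feedback, decision logs, complaints
    metrics: dict               # metric name -> collection frequency
    retraining_triggers: dict   # metric name -> threshold that forces review
    risk_feedback: str          # how findings re-enter the Article 9 RMS
    incident_routing: str       # how serious incidents reach Article 73 reporting

plan = PMMPlan(
    system_name="credit-scoring-v3",
    data_sources=["deployer feedback", "decision traces", "support tickets"],
    metrics={"accuracy": "daily", "drift_psi": "continuous"},
    retraining_triggers={"drift_psi": 0.25, "accuracy": 0.90},
    risk_feedback="quarterly RMS re-evaluation per Article 9",
    incident_routing="escalate to Article 73 workflow on serious-incident match",
)
assert "drift_psi" in plan.retraining_triggers
```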
How does this connect to Article 73 incident reporting?
Article 72 PMM is the upstream pipeline; Article 73 incident reporting is the downstream regulator notification. PMM data feeds the Article 9 RMS for continuous risk re-evaluation, and triggers Article 73 when a serious incident is detected. Without an Article 72 PMM in place, you cannot detect the incidents Article 73 requires you to report within 15 days (or as little as 2 days for the most severe categories, such as widespread infringement or serious disruption of critical infrastructure).
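The hand-off described above can be sketched as a small routing function: every finding feeds the Article 9 RMS, and serious findings additionally pick up an Article 73 reporting deadline. The severity labels and the two-entry deadline map are simplifications for illustration, not the Act's full Article 73 taxonomy.

```python
# Simplified sketch of the Article 72 -> Article 9 / Article 73 flow.
# Severity labels and deadline mapping are illustrative simplifications.
from datetime import date, timedelta

REPORTING_DEADLINE_DAYS = {"serious": 15, "most-severe": 2}

def route_finding(finding: dict, detected_on: date) -> list:
    """Feed every finding into the Article 9 RMS; serious ones also get an
    Article 73 reporting deadline counted from the detection date."""
    actions = ["update Article 9 risk re-evaluation"]
    days = REPORTING_DEADLINE_DAYS.get(finding["severity"])
    if days is not None:
        deadline = detected_on + timedelta(days=days)
        actions.append(f"report to market surveillance authority by {deadline}")
    return actions

routine = route_finding({"severity": "routine"}, date(2026, 3, 1))
serious = route_finding({"severity": "serious"}, date(2026, 3, 1))
assert routine == ["update Article 9 risk re-evaluation"]
assert "2026-03-16" in serious[1]   # 15 days after detection
```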
What metrics should we collect?
Drift detection (data distribution shift, prediction shift), accuracy on real-world data vs validation set, fairness metrics across demographic groups (Article 10), error rates by use case + user segment, downtime + degraded-mode incidents, security events flagged by Article 15(5) controls. Frequency: continuous for Tier-1 metrics, daily for Tier-2, weekly for Tier-3.
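One Tier-1 metric from the list above, data-distribution drift, is commonly measured with the Population Stability Index (PSI). A minimal sketch follows; the 0.25 alert threshold is a common industry convention, not an Article 72 requirement.

```python
# Population Stability Index (PSI) between a reference sample (e.g. the
# validation set) and live production data. PSI near 0 = stable; values
# above ~0.25 are conventionally treated as significant drift.
import math

def psi(expected: list, observed: list, bins: int = 10) -> float:
    lo = min(min(expected), min(observed))
    hi = max(max(expected), max(observed))
    width = (hi - lo) / bins or 1.0

    def proportions(values):
        counts = [0] * bins
        for v in values:
            i = min(int((v - lo) / width), bins - 1)
            counts[i] += 1
        # small floor avoids log(0) for empty bins
        return [max(c / len(values), 1e-4) for c in counts]

    p, q = proportions(expected), proportions(observed)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

reference = [i / 100 for i in range(100)]            # stand-in validation scores
identical = psi(reference, reference)
shifted = psi(reference, [v + 0.5 for v in reference])
assert identical < 0.01    # same distribution -> near-zero PSI
assert shifted > 0.25      # shifted distribution -> above alert threshold
```

A continuous (Tier-1) schedule would run this on a sliding window of live predictions; the daily and weekly tiers batch the same computation.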
How does MEOK help?
meok-attestation-verify is the cryptographic backbone: every PMM check emits a signed cert your auditor can verify. /transparency (£399-£1,499/mo) gives you continuous decision-trace logging, the granular PMM data layer. /audit-prep-bundle (£4,950) wraps Article 72 PMM plan generation, Article 9 RMS, and the Article 73 incident-reporting flow into a 14-day signed evidence pack.
Source: EU AI Act Regulation 2024/1689 Art. 72 · MEOK AI Labs · CSOAI LTD · UK Companies House 16939677