EU AI Act Deployer Obligations Checklist XLSX
Professional Excel worksheet with Start Here, Dashboard, obligations checklist, lookups, sources, and review notes.
Download XLSX worksheet →

Deployer obligations checklist Markdown
Plain Markdown checklist for internal notes and review.
Download Markdown →

Checklist
Use this checklist after you confirm that your organization is using or deploying an AI system. For high-risk systems, a deployer file can’t stop at “we have a policy.” It needs owners, instructions, oversight, input-data checks, notices, logs where you control them, and escalation paths.
| Checklist item | Evidence to retain | Reference |
|---|---|---|
| Confirm role | Record whether your organization is the deployer, provider, or potentially both for each AI system. | Article 3, Article 25, Article 26 |
| Confirm high-risk status | Check whether the system is high-risk under Article 6 and Annex III before relying on a light-touch process. | Article 6, Annex III |
| Use according to instructions | Retain the provider instructions and map how your operating procedure follows them. | Article 26 |
| Assign human oversight | Name the role responsible for supervision, intervention, override, and escalation. | Article 26, Article 14 |
| Control input data | Document how deployer-controlled input data is relevant and sufficiently representative for the intended purpose. | Article 26(4) |
| Keep logs that are under your control | Preserve system logs or usage records to the extent the deployer controls those logs. | Article 26 |
| Complete FRIA where triggered | Assess whether Article 27 applies and retain the impact assessment before first use where required. | Article 27 |
| Provide transparency notices | Check whether Article 50 notice or labelling obligations apply to your use case. | Article 50 |
| Monitor operation | Define how users report anomalies, performance concerns, rights impacts, and vendor incidents. | Article 26, Article 72/73 escalation context |
| Retain evidence | Keep the system inventory, classification rationale, provider documents, oversight records, training records, and review dates. | Evidence management |
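For teams that track the inventory and classification rationale in code rather than a spreadsheet, the evidence items above can be captured in a simple machine-readable record. This is an illustrative sketch only: the class name, fields, and example values are assumptions mirroring the checklist, not a schema prescribed by the Regulation.

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """One row of a deployer's AI system inventory (illustrative fields)."""
    name: str
    role: str                      # "deployer", "provider", or "both" (Article 3, 25, 26)
    high_risk: bool                # Article 6 / Annex III assessment outcome
    oversight_owner: str           # a named role, not just "the business"
    fria_required: bool            # Article 27 trigger assessment
    evidence_folder: str           # where provider docs, logs, records live
    open_items: list = field(default_factory=list)

# Hypothetical entry for a CV-screening system classified as high-risk.
record = AISystemRecord(
    name="cv-screening",
    role="deployer",
    high_risk=True,
    oversight_owner="HR Operations Lead",
    fria_required=True,
    evidence_folder="evidence/cv-screening",
)
record.open_items.append("Collect provider instructions")
print(record.name, record.high_risk, record.oversight_owner)
```

A record like this makes review dates and open items easy to query across systems, but it supplements the retained documents; it does not replace them.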
Practical order of work
- Confirm the system and your role first. Don’t build controls around a use case nobody has classified.
- Run the deployer obligation assessment.
- Collect provider instructions and system documentation before writing local procedures.
- Name the oversight owner and the escalation path. “The business owns it” is not enough.
- Create one evidence folder per high-risk or potentially high-risk system.
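The last step above can be scaffolded with a short script. The subfolder names below mirror the evidence items in the checklist; the layout itself is an illustrative assumption, not a prescribed structure.

```python
import pathlib

# Subfolders mirror the checklist's evidence items (names are illustrative).
EVIDENCE_ITEMS = [
    "inventory-and-classification",
    "provider-instructions",
    "oversight-records",
    "input-data-checks",
    "transparency-notices",
    "logs",
    "training-records",
    "review-dates",
]

def create_evidence_folder(system_name: str, root: str = "evidence") -> pathlib.Path:
    """Scaffold one evidence folder for a single AI system."""
    base = pathlib.Path(root) / system_name
    for item in EVIDENCE_ITEMS:
        (base / item).mkdir(parents=True, exist_ok=True)
    # A README naming the owner and next review date keeps the folder auditable.
    (base / "README.txt").write_text(
        f"System: {system_name}\nOwner: TBD\nNext review: TBD\n"
    )
    return base

# Example: scaffold a folder for a hypothetical "cv-screening" system.
print(create_evidence_folder("cv-screening"))
```

One folder per system keeps the classification rationale, provider documents, and oversight records findable when a reviewer asks for them.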
FAQ
Is every AI user a deployer?
Not automatically. The role depends on the system, the use case, the organization, and the legal definitions. Treat this as an evidence starter, not a legal conclusion.
Can this replace legal review?
No. It helps teams collect the facts and records legal or compliance reviewers need.
Source and review note
This page is an educational evidence starter. It is not legal advice and does not confirm compliance. Review the official text of Regulation (EU) 2024/1689 and obtain qualified legal review before relying on any template for a formal compliance decision.
- Regulation (EU) 2024/1689 on EUR-Lex
- Last reviewed against official source references: 27 April 2026.