Evidence starter

EU AI Act Deployer Obligations Checklist

Use this checklist to turn Article 26 into records your team can actually collect: role confirmation, provider instructions, oversight owners, input data checks, logs where controlled, FRIA triggers, Article 50 notices, monitoring, and retention.

Free · No login · Local download · Last reviewed 2026-04-27

EU AI Act Deployer Obligations Checklist XLSX

Professional Excel worksheet with Start Here, Dashboard, obligations checklist, lookups, sources, and review notes.

Download XLSX worksheet →

Deployer obligations checklist Markdown

Plain Markdown checklist for internal notes and review.

Download Markdown →

Checklist

Use this checklist after you confirm that your organization is using or deploying an AI system. For high-risk systems, a deployer file can’t stop at “we have a policy.” It needs owners, instructions, oversight, input-data checks, notices, logs where you control them, and escalation paths.

| Checklist item | Evidence to retain | Reference |
| --- | --- | --- |
| Confirm role | Record whether your organization is the deployer, provider, or potentially both for each AI system. | Article 3, Article 25, Article 26 |
| Confirm high-risk status | Check whether the system is high-risk under Article 6 and Annex III before relying on a light-touch process. | Article 6, Annex III |
| Use according to instructions | Retain the provider instructions and map how your operating procedure follows them. | Article 26 |
| Assign human oversight | Name the role responsible for supervision, intervention, override, and escalation. | Article 26, Article 14 |
| Control input data | Document how deployer-controlled input data is relevant and sufficiently representative for the intended purpose. | Article 26(4) |
| Keep logs where under your control | Preserve system logs or usage records where they are under the deployer's control. | Article 26 |
| Complete FRIA where triggered | Assess whether Article 27 applies and retain the impact assessment before first use where required. | Article 27 |
| Provide transparency notices | Check whether Article 50 notice or labelling obligations apply to your use case. | Article 50 |
| Monitor operation | Define how users report anomalies, performance concerns, rights impacts, and vendor incidents. | Article 26, Article 72/73 escalation context |
| Retain evidence | Keep the system inventory, classification rationale, provider documents, oversight records, training records, and review dates. | Evidence management |
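The checklist above can be tracked as structured data rather than an ad hoc document. The sketch below is purely illustrative, assuming a hypothetical internal tool; the class names, fields, and the example system are not prescribed by the Regulation or by this checklist.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: model each checklist row so the team can report
# which evidence items are still missing for a given AI system.
@dataclass
class ChecklistItem:
    name: str                        # checklist item, e.g. "Confirm role"
    reference: str                   # the Article(s) it maps to
    evidence_collected: bool = False # has evidence been retained?

@dataclass
class DeployerFile:
    system_name: str
    items: list = field(default_factory=list)

    def missing_evidence(self):
        # Names of checklist items still lacking retained evidence.
        return [i.name for i in self.items if not i.evidence_collected]

# Example system name and status flags are invented for illustration.
file = DeployerFile("hr-screening-tool", items=[
    ChecklistItem("Confirm role", "Article 3, 25, 26", evidence_collected=True),
    ChecklistItem("Assign human oversight", "Article 26, Article 14"),
    ChecklistItem("Complete FRIA where triggered", "Article 27"),
])
print(file.missing_evidence())
# → ['Assign human oversight', 'Complete FRIA where triggered']
```

A structure like this maps one-to-one onto the XLSX worksheet's checklist tab, so gaps surface as a list instead of being buried in prose.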

Practical order of work

  1. Confirm the system and your role first. Don’t build controls around a use case nobody has classified.
  2. Run the deployer obligation assessment.
  3. Collect provider instructions and system documentation before writing local procedures.
  4. Name the oversight owner and the escalation path. “The business owns it” is not enough.
  5. Create one evidence folder per high-risk or potentially high-risk system.
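Step 5 can be scripted so every system's evidence folder starts with the same shape. This is a minimal sketch; the subfolder names are hypothetical conventions mirroring the checklist categories, not names required by the Act.

```python
import tempfile
from pathlib import Path

# Hypothetical folder layout mirroring the checklist categories above.
SUBFOLDERS = [
    "01-role-and-classification",
    "02-provider-instructions",
    "03-oversight-and-escalation",
    "04-input-data-checks",
    "05-logs",
    "06-fria",
    "07-transparency-notices",
    "08-monitoring-and-incidents",
]

def scaffold(root: Path, system_name: str) -> Path:
    """Create one evidence folder per system with standard subfolders."""
    base = root / system_name
    for sub in SUBFOLDERS:
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

# Demo in a temporary directory; the system name is invented.
root = Path(tempfile.mkdtemp())
base = scaffold(root, "hr-screening-tool")
print(sorted(p.name for p in base.iterdir()))
```

A fixed scaffold makes review faster: an empty subfolder is itself a finding, because it shows which obligation has no retained evidence yet.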

FAQ

Is every AI user a deployer?

Not automatically. The role depends on the system, the use case, the organization, and the legal definitions. Treat this as an evidence starter, not a legal conclusion.

Can this replace legal review?

No. It helps teams collect the facts and records legal or compliance reviewers need.

Source and review note

This page is an educational evidence starter. It is not legal advice and does not confirm compliance. Review the official text of Regulation (EU) 2024/1689 and obtain qualified legal review before relying on any template for a formal compliance decision.