
EU AI Act Deployer Obligations: What You Need to Know

ELSA AI Team
October 2025

GenAI Assure™ maps directly to Article 26 (deployer duties) for organisations that use third‑party AI tools. The Framework focuses on practical controls and audit evidence so deployers can meet their obligations without building models or changing providers.

What Article 26 Requires of Deployers

The Framework distils Article 26 into a deployer checklist that requires you to:

  • Use AI systems in line with the provider's instructions.
  • Keep comprehensive usage logs.
  • Ensure human oversight where required.
  • Monitor system operation and performance.
  • Implement robust data governance.
  • Correct misuse and address identified risks.
  • Co‑operate with providers and authorities.
  • Place appropriate transparency notices.

For high‑risk uses, ensure trained staff, defined oversight procedures, and log retention that meets provider and legal requirements.

How GenAI Assure™ Implements These Duties

Comprehensive usage logging

Detect & Monitor (GA‑DM‑001) defines an AI‑specific event schema (user/role/device; tool; use_case_id; action such as prompt, output, upload, or webhook; data‑classification tags; decision; connector/token ID) together with detections and dashboards. Logs are routed to the SIEM with integrity protection via Documentation & Compliance (GA‑DC‑001).
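As an illustrative sketch of the schema described above, one logged event might be assembled as follows. The function name, field layout, and sample values are assumptions for illustration, not the Framework's actual implementation:

```python
import json
from datetime import datetime, timezone

def build_ai_event(user, role, device, tool, use_case_id, action,
                   data_tags, decision, connector_id):
    """Assemble one AI-usage log event using the GA-DM-001 field names
    listed above (structure is illustrative, not normative)."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "device": device,
        "tool": tool,
        "use_case_id": use_case_id,
        "action": action,              # prompt | output | upload | webhook
        "data_classification": data_tags,
        "decision": decision,          # e.g. allow | block | redact
        "connector_token_id": connector_id,
    }

event = build_ai_event(
    user="j.doe", role="analyst", device="laptop-042",
    tool="chat-assistant", use_case_id="UC-017", action="prompt",
    data_tags=["internal"], decision="allow", connector_id="tok-9f3",
)
print(json.dumps(event))  # one structured JSON line, ready for SIEM ingestion
```

Emitting each event as a single structured JSON line keeps the schema machine-parseable, so SIEM detections and dashboards can key off fields like use_case_id and decision directly.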

Human oversight & operational monitoring

High‑risk uses require documented oversight procedures and trained staff, with operational monitoring surfaced through GA‑DM dashboards (e.g., policy‑violation trends, exception backlog).

Transparency notices

GA‑DC‑001 includes transparency/AI labels among required evidence items (including label screenshots) within the Evidence Pack.

Robust data governance

Technical Protection (GA‑TP‑001) establishes SSO/MFA, secrets & token hygiene, AI‑aware DLP (prompt/output redaction, multi‑channel coverage), and egress allow‑lists; GA‑DC‑001 covers lawful basis, DPIA/FRIA, RoPA, retention schedules, transfer controls, and vendor documentation.
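The prompt/output redaction control above can be pictured as a pre-send filter that strips sensitive values before a prompt leaves the organisation's boundary. A minimal sketch, assuming a regex-based approach (the patterns and function name are illustrative; a production AI-aware DLP engine would use far richer, validated detectors):

```python
import re

# Illustrative PII patterns only; real DLP relies on validated detectors.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "IBAN":  re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def redact_prompt(text):
    """Replace detected PII with a labelled placeholder so the
    outbound prompt carries no raw sensitive values."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

print(redact_prompt("Contact jane.doe@example.com about invoice 42"))
# -> Contact [REDACTED-EMAIL] about invoice 42
```

Labelled placeholders (rather than blank deletions) preserve context for the model while still producing a log trail of what was redacted and why.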

Correct misuse & co‑operate

Response & Remediation (GA‑RR‑001) provides AI‑specific runbooks (e.g., PII exfiltration, token compromise, misleading/deepfake content) and redress workflows, supporting corrective action and coordination with stakeholders.

Audit‑ready proof

The Evidence Pack (GA‑DC‑001) and evidence‑automation pattern specify sources (SIEM, DLP, CASB/Proxy, IdP, vendor portals), WORM/object storage with SHA‑256, and correlation keys (use_case_id, control_id, vendor_id, token_id, etc.).
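The SHA-256 step above amounts to fingerprinting each evidence artifact at capture time, so that later tampering is detectable when the hash is recomputed against the copy in WORM storage. A minimal sketch (the function name and manifest layout are illustrative assumptions):

```python
import hashlib
import json

def fingerprint(artifact: bytes, use_case_id: str, control_id: str):
    """Hash one evidence artifact and attach the correlation keys
    named above; the manifest entry layout is illustrative."""
    return {
        "sha256": hashlib.sha256(artifact).hexdigest(),
        "use_case_id": use_case_id,
        "control_id": control_id,
    }

entry = fingerprint(b"label-screenshot-bytes", "UC-017", "GA-DC-001")
print(json.dumps(entry))
```

Storing the hash alongside correlation keys such as use_case_id and control_id lets an auditor join a single artifact back to the use case and control it evidences.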

Bottom Line

Implementing GenAI Assure™ as written—AI‑specific logging to SIEM with integrity controls, transparency labels captured as evidence, SSO/DLP/egress protections, documented DPIA/FRIA and transfer registers, and AI incident runbooks—provides the processes and artifacts the Framework maps to Article 26 duties for deployers.

Note: The Framework does not prescribe fixed log‑retention durations or quote fine amounts; retention and oversight must meet provider and legal requirements.

Ready to Implement These Strategies?

Our team can help you put these insights into practice with a tailored AI governance solution.

Get Your Readiness Assessment