Compliance · EU AI Act
Last updated 8 July 2025
8 min read

EU AI Act Compliance Checklist 2025: 10 Practical Steps for Small- & Mid-Sized Businesses

From 2 August 2025, national regulators must start enforcing the first wave of the EU AI Act. Get ready with this fast-track, SME-friendly checklist.

Urgent Compliance Alert

From 2 August 2025, national regulators must start enforcing the first wave of the EU AI Act: transparency banners for chatbots, deep-fake labelling, mandatory logging for general-purpose AI, and outright bans on "unacceptable-risk" uses. The law's headline fine—€35 million or 7% of global turnover under EU AI Act Article 99 penalties—applies even to SMEs.

A fresh EU AI Act timeline confirmed last week that Brussels will not delay the deadline, despite industry pressure. Yet only 13.5% of EU firms used AI in 2024.

Below is a fast-track, SME-friendly checklist—plus every link you need to verify each step.

⚡ What's new as of July 2025

Code of Practice delayed

The Commission now says its voluntary GPAI guidance may slip to "end 2025," leaving companies to self-interpret in the meantime. (Reuters)

No deadline extension

Brussels insists the 2 August obligations stand, rejecting a two-year postponement requested by 45 major tech firms. (Reuters)

The 10-Step SME Checklist

1. Map your AI footprint

Catalogue every in-house or SaaS system that uses machine learning.

Why it matters: Inventory unlocks free help inside an Article 57 regulatory sandbox for SMEs. (Learn more)

2. Classify risk

Tag each system as unacceptable, high, limited, or minimal risk.

Why it matters: Only high-risk systems need the full QMS; chatbots usually fall under limited-risk transparency duties. (Learn more)
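To make the tagging concrete, here is a minimal sketch of a risk register in Python. The system names and tier assignments are purely illustrative, not a legal classification; always confirm tiers against Annex III and the prohibited-practices list in Article 5.

```python
# Sketch: tag each catalogued system with an EU AI Act risk tier.
# Names and tier assignments are illustrative, not legal advice.
from enum import Enum

class RiskTier(Enum):
    UNACCEPTABLE = "unacceptable"  # banned outright (e.g., social scoring)
    HIGH = "high"                  # full QMS + conformity assessment
    LIMITED = "limited"            # transparency duties (e.g., chatbots)
    MINIMAL = "minimal"            # no new obligations

# Hypothetical SME inventory from Step 1
inventory = {
    "support-chatbot": RiskTier.LIMITED,
    "cv-screening-model": RiskTier.HIGH,
    "spam-filter": RiskTier.MINIMAL,
}

def needs_full_qms(name: str) -> bool:
    """Only high-risk systems trigger the full quality-management system."""
    return inventory[name] is RiskTier.HIGH

print(needs_full_qms("cv-screening-model"))  # True
print(needs_full_qms("support-chatbot"))     # False
```

Even a ten-line register like this gives you the artefact regulators and sandbox administrators ask for first: a named list of systems with a defensible tier per system.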

3. Tighten data governance

Check that training data are representative, free of errors to the best extent possible, and documented.

Why it matters: Article 10 demands this; pair it with CISA AI Data Security Best Practices for encryption and tamper-proof audits. (Learn more)

4. Set up continuous logging

Pipe model inputs/outputs into an audit store (e.g., Loki + Grafana).

Why it matters: Logs must let regulators reproduce any output after 2 Aug 2025. (Learn more)
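A reproducible audit trail does not require heavyweight tooling on day one. Below is a hedged sketch of an append-only JSON-lines log with a per-record hash for tamper evidence; the field names are my own, and the `StringIO` store stands in for a real append-only file or a Loki ingestion pipeline.

```python
# Sketch: append-only JSON-lines audit log so each model call can be replayed.
# Field names are illustrative; adapt to your stack (e.g., ship lines to Loki).
import json, hashlib, datetime, io

def log_inference(store, model_id, model_version, prompt, output):
    record = {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "model": model_id,
        "version": model_version,  # pin the exact version for reproducibility
        "input": prompt,
        "output": output,
    }
    line = json.dumps(record, sort_keys=True)
    # Tamper evidence: hash each line so later edits are detectable.
    record_hash = hashlib.sha256(line.encode()).hexdigest()
    store.write(line + "\t" + record_hash + "\n")
    return record_hash

store = io.StringIO()  # stand-in for a real append-only store
h = log_inference(store, "support-chatbot", "2025-07-01", "Hi", "Hello! (AI)")
```

The key design choice is recording the model version alongside every input/output pair: without it, you cannot reproduce an output after the model is updated.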

5. Run bias & performance tests

Publish short test summaries on bias, robustness, and cybersecurity.

Why it matters: GPAI providers and all high-risk deployers must do this—delay on guidance is no defence. (Learn more)
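As a starting point for a bias test summary, here is a minimal demographic-parity check for a binary classifier. The outcome data are hypothetical, and the 0.8 threshold is the widely used "four-fifths rule", not a figure from the Act itself.

```python
# Sketch: minimal demographic-parity check for a binary classifier.
# Data and the 0.8 threshold (the "four-fifths rule") are illustrative.
def selection_rate(decisions):
    return sum(decisions) / len(decisions)

def parity_ratio(group_a, group_b):
    """Ratio of selection rates; values below ~0.8 warrant investigation."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Hypothetical positive-decision outcomes per applicant group
group_a = [1, 1, 0, 1, 0, 1, 1, 0]  # 62.5% selected
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # 37.5% selected

ratio = parity_ratio(group_a, group_b)
print(f"parity ratio: {ratio:.2f}")  # 0.60 -> below 0.8, flag for review
```

Running a check like this per release, and keeping the one-line result, is exactly the kind of short published test summary the step describes.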

6. Embed human oversight

Add a Pause / Override switch and train at least two staff per model.

Why it matters: Article 14 requires 'effective human oversight' throughout the life-cycle. (Learn more)
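A pause/override switch can be as simple as a gate wrapped around the inference call. This is a sketch under my own naming, with a lambda standing in for your real model function; the fallback message is placeholder wording.

```python
# Sketch: a pause/override gate wrapped around a model call.
# `model_fn` is a placeholder for your real inference function.
import threading

class OversightGate:
    def __init__(self, model_fn):
        self._model_fn = model_fn
        self._paused = threading.Event()  # set = paused

    def pause(self):
        self._paused.set()

    def resume(self):
        self._paused.clear()

    def __call__(self, prompt):
        if self._paused.is_set():
            # Route to a human instead of the model while paused.
            return "System paused by operator; a human will respond."
        return self._model_fn(prompt)

gate = OversightGate(lambda p: f"model answer to: {p}")
gate.pause()
print(gate("refund request"))  # human fallback message
gate.resume()
print(gate("refund request"))  # model answer
```

Training two staff per model means two people who know where this switch lives and when to flip it.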

7. Strengthen transparency notices

Label chatbots, watermark synthetic images, and offer opt-outs.

Why it matters: Transparency duties for limited-risk systems start this year. (Learn more)
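For chatbots, the simplest control is a disclosure prepended to every reply. The notice wording and function name below are illustrative only, not the Act's required text; run the final wording past legal review.

```python
# Sketch: prepend an AI disclosure to chatbot replies.
# Notice wording is illustrative, not the Act's mandated text.
AI_NOTICE = "[AI] You are interacting with an automated assistant."

def label_reply(reply: str) -> str:
    """Attach the disclosure once, even if the reply is re-processed."""
    if reply.startswith(AI_NOTICE):
        return reply
    return f"{AI_NOTICE}\n{reply}"

labelled = label_reply("Your order ships tomorrow.")
print(labelled)
```

The idempotence check matters in practice: replies often pass through several middleware layers, and double-stacked notices look broken to users.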

8. Vet vendors & contracts

Add AI-specific clauses on dataset ownership, IP, and bias indemnities.

Why it matters: As a deployer you remain responsible for vendor-supplied AI; the LexisNexis AI Technology Risks Checklist offers legal templates. (Learn more)

9. Refresh privacy & consent

Update privacy notices for any new automated decision-making or cross-border transfers.

Why it matters: Clear, GDPR-aligned language lowers complaint risk. (Learn more)

10. Adopt a living governance blueprint

Align controls, owners, and evidence in one dashboard; review quarterly.

Why it matters: The Relyance AI Governance Blueprint maps controls straight to EU AI Act, GDPR, HIPAA, and more. (Learn more)

Key Takeaways

Act now—guidance delays don't stop fines

Inventory + risk tagging covers ~80% of early compliance work.

SME sandboxes provide free legal templates and testing support; apply early to beat the queue.

Need Help Getting Compliant?

Book a free 30-minute gap analysis with Pro AI Assistant—we'll map your tools, tag risks, and hand you a plug-and-play policy pack.

Book Free Gap Analysis


Ready for EU AI Act Compliance?

Our AI compliance experts can help you navigate the EU AI Act requirements and ensure your business stays compliant while maximizing AI benefits.