EU and national authorities · August 2026 · Regulation (EU) 2024/1689

EU AI Act

The EU AI Act classifies AI systems by risk. Organisations using AI need an inventory, AI literacy, transparency processes, and stronger controls for high-risk use cases.

Practical impact for EU SMBs.

  • Companies using AI for HR, finance, scoring, safety, customer support, or internal decision support.
  • SaaS teams providing AI features to customers in the EU.
  • Organisations using generative AI in operational processes.
  • Providers and deployers of high-risk AI systems listed in Annex III.

What needs to be provable.

Article 4

AI literacy

Train staff who use, supervise, or manage AI systems.

Due: February 2025

Article 26

AI use inventory

Record purpose, owner, risk, input data, and business process for each AI system.

Due: August 2026
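The Article 26 inventory fields above map naturally onto a structured record. A minimal sketch in Python (the field names are illustrative, not mandated by the Act):

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One entry in an Article 26-style AI use inventory."""
    name: str
    purpose: str           # what the system is used for
    owner: str             # accountable person or team
    risk_level: str        # e.g. "minimal", "limited", "high"
    input_data: str        # data categories the system consumes
    business_process: str  # business process the system supports
```

A real inventory would typically also track review dates and status per system; this sketch shows only the fields named above.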

Article 14

Human oversight

Assign responsibility for decisions supported by high-risk AI systems.

Due: August 2026

Article 50

Transparency

Inform users when they interact with an AI system or receive AI-generated content, where the Act requires disclosure.

Due: depends on use

Risk is measured in money, contracts, and lost trust.

Violation type | Maximum sanction | Enforcement
Prohibited AI practices | EUR 35 million or 7% of worldwide turnover | National authority
High-risk AI obligations | EUR 15 million or 3% of worldwide turnover | National authority
Incorrect information to authorities | EUR 7.5 million or 1% of worldwide turnover | National authority

Turn obligations into controls, evidence, and deadlines.

AI inventory

Track tools, owners, purposes, risk levels, and review status.

AI policy

Generate working policies for acceptable AI use and human oversight.

AI literacy evidence

Record training completion and staff acknowledgement.

