EU AI Act for SMBs: what to prepare in 2026
The EU AI Act distinguishes between prohibited practices, high-risk systems, and ordinary AI use. Even everyday generative AI use requires clear rules, training, and records.
Start with an AI inventory
List every AI tool used in the company: ChatGPT, Microsoft Copilot, HR screening, scoring, customer support, and internal automations. For each tool, define the owner, purpose, input data, and expected impact.
The inventory helps separate ordinary productivity use from high-risk scenarios that require stricter documentation and human oversight.
- AI tool name and provider
- Purpose of use and responsible owner
- Type of data processed
- Risk to customers, employees, or decision-making
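The fields above can be kept as structured data rather than scattered notes. A minimal sketch in Python; the field names and the risk categories are illustrative choices, not wording prescribed by the Act:

```python
from dataclasses import dataclass

@dataclass
class AITool:
    """One row of the AI inventory; fields mirror the checklist above."""
    name: str               # AI tool name
    provider: str           # provider
    purpose: str            # purpose of use
    owner: str              # responsible owner
    data_types: list[str]   # type of data processed
    risk: str               # illustrative levels: "minimal" or "high"

inventory = [
    AITool("ChatGPT", "OpenAI", "drafting and research", "Head of Ops",
           ["prompts, no personal data"], "minimal"),
    AITool("CV screening", "VendorX", "pre-filter job applications", "HR lead",
           ["applicant personal data"], "high"),
]

# High-risk entries need stricter documentation and human oversight.
high_risk = [t.name for t in inventory if t.risk == "high"]
```

Even a spreadsheet with these columns works; the point is that every tool has a named owner and an explicit risk call.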
AI literacy
Companies need to show that people using AI understand system limits, hallucination risk, personal data rules, and human review requirements.
A short training session is not enough by itself. Keep a record of attendance, the training version delivered, and each employee's acknowledgement of the internal AI policy.
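A training log that captures those three elements can be very simple. A sketch, assuming a flat record per employee (names and fields are hypothetical):

```python
from datetime import date

# Minimal AI-literacy training log; fields are illustrative, not mandated.
training_log = [
    {"employee": "a.novak", "course": "AI literacy basics",
     "version": "2026-01", "date": date(2026, 1, 15),
     "policy_acknowledged": True},
    {"employee": "b.svoboda", "course": "AI literacy basics",
     "version": "2026-01", "date": date(2026, 1, 15),
     "policy_acknowledged": False},
]

# Evidence check: who still has to acknowledge the internal AI policy?
pending = [r["employee"] for r in training_log
           if not r["policy_acknowledged"]]
```

Recording the training version matters because the content will change as guidance evolves; a record tied to an outdated version shows when retraining is due.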
Policy and control points
An internal AI policy should explain what is allowed, what is forbidden, who approves new AI use, and how incidents or incorrect outputs are handled. Practical teams connect the policy to a checklist of controls and evidence.
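The policy-to-controls link can be modeled as a simple mapping from each rule to its control, its evidence, and its current status. A sketch with hypothetical rules and controls:

```python
# Hypothetical mapping of policy rules to controls and evidence.
controls = [
    {"rule": "No personal data in public AI tools",
     "control": "Approved-tools list plus staff training",
     "evidence": "Training records", "status": "done"},
    {"rule": "New AI tools require approval",
     "control": "Intake form reviewed by the AI owner",
     "evidence": "Approval records", "status": "open"},
    {"rule": "Incorrect outputs are reported",
     "control": "Incident channel with periodic review",
     "evidence": "Incident log", "status": "open"},
]

# Open items are the gaps between the written policy and practice.
open_items = [c["rule"] for c in controls if c["status"] == "open"]
```

Reviewing the open items on a fixed cadence turns the policy from a static document into a checklist that can actually be audited.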
Turn this article into a control checklist
Splnit.eu maps obligations to controls, evidence, and deadlines so compliance can be checked continuously.
View platform