The EU AI Act requires providers and deployers of AI systems to ensure that their staff understand the AI they use. In a new Q&A document, the European Commission gives more detail about what this rule means in practice and how it will be enforced. Notably, it also suggests that in some circumstances firms may be expected to extend training to service providers and clients.
AI literacy goes beyond technical knowledge
Under the AI Act, “AI literacy” broadly means having the skills, knowledge and understanding to deploy AI systems. According to the Commission, the training should be tailored to:
- the knowledge and experience of the people being trained, and
- the sectoral context, the purpose of the AI system and the risks posed by it.
As well as technical knowledge, the Commission encourages businesses to use the training to cover awareness of ethical principles and understanding of the legal requirements under the AI Act.
Training requirement could extend to clients
The AI literacy requirement applies to both providers and deployers of AI. Training should be given to employees and to “other persons” dealing with the operation and use of AI systems on their behalf.
According to the Commission, these “other persons” are not employees but “persons broadly under the organisational remit”. The examples it suggests include contractors, service providers, vendors, partners and clients.
The Commission clarifies that there is no legal obligation to measure employees’ AI knowledge, although assessing it may help pitch the training at the right level. Nor do firms need to issue certificates; keeping an internal record of training attendance is enough.
Minimum content
AI literacy training should cover, at least:
- General understanding: What AI is, how it works, how it is used and what the opportunities and dangers are.
- Role: Whether the organisation is an AI provider or deployer.
- Risk awareness: What employees need to know about the risks posed by AI systems – e.g. hallucinations – and any mitigations.
The training should also address knowledge gaps. Firms should recognise differences in expertise (e.g. technical vs. non-technical staff) and consider the context in which the AI is applied (e.g. the financial services sector).
The AI literacy requirement applies to all kinds of AI system. Deployers of high-risk systems should note related obligations under the Act, for example to ensure that those overseeing high-risk AI have the necessary training, authority and support.
Enforcement to begin next year
While the AI literacy requirement has applied since 2 February 2025, enforcement begins in 2026. From 2 August 2026, national market surveillance authorities will be responsible for supervising and enforcing compliance. The Commission says that enforcement should be proportionate, but an incident caused by a lack of appropriate training would be an aggravating factor.
Firms in scope of the AI Act should continue to build out their AI literacy programmes, focusing on tailoring their training to different groups and to their organisation’s particular uses of AI.
Get in touch with us to hear more about how Linklaters can support your AI training.