
The European Union recently published guidance on the training that companies should provide to employees to achieve the “AI literacy” required under the EU AI Act – and like so much else about artificial intelligence these days, that guidance has both good news and bad news for corporate compliance officers.

The good news is that the guidance, released on May 7, isn’t terribly prescriptive. It describes a set of goals companies should achieve for AI literacy among their workforce, and the factors you should take into account as you develop your training program to meet them.

That also brings us to the bad news: there is no clearly defined, check-the-box path to achieving compliance with this provision (Article 4) of the AI Act. Instead, compliance officers will need to work with leaders of first- and second-line operations teams to think about how your organization wants to use AI, the risks inherent in those plans, and what your employees will need to know to avoid stepping on those AI-enhanced risk landmines.

First, some background on Article 4 of the EU AI Act

Article 4 of the EU AI Act says that all “providers and deployers of AI systems” should take steps to ensure that their employees and third parties (say, contractors working on your behalf) have “a sufficient level of AI literacy” to operate those systems responsibly. Put simply, Article 4 requires businesses to train their employees on how to use AI wisely.

And what does “AI literacy” mean? The AI Act defines it as the “skills, knowledge, and understanding” necessary so that employees can “make an informed deployment of AI systems, as well as gain awareness about the opportunities and risks of AI and possible harm it can cause.”

Enforcement of Article 4 won’t start until August 2026, and the penalties for non-compliance aren’t yet entirely clear. Each EU member state is supposed to adopt its own law implementing Article 4, and those laws will include the penalties a company would face for non-compliance.

That said, let’s appreciate the bigger, more practical picture here. Do you really want to leave your employees untrained in how to use AI responsibly? That’s a recipe for disaster no matter what the EU or any other jurisdiction demands.

So, let’s take a close look at the EU’s Article 4 guidance and see how it can shape the AI training you’re going to need regardless.

AI literacy should hit on four fundamental points

The guidance begins by stating the obvious: artificial intelligence is evolving too quickly, and has too many potential use cases, for any single, precise training requirement. Companies should instead strive to deliver training that addresses four points.

Four ways to enable AI literacy

  1. Provide a general understanding of where and how AI works within your organization, including the attendant risks and opportunities
  2. Reflect your company’s role as either a provider of AI systems (which brings one set of risks) or a user of AI systems (which brings a different set)
  3. Address the risks of the specific AI tools you’re using. For example, a primary risk for generative AI is hallucination; for predictive analytics AI, it’s algorithmic discrimination. For other systems, it could well be both
  4. Make the training role-based, accounting for each employee’s job, experience with AI, and how they are likely to use AI systems

The guidance also stresses that simply relying on the AI system’s instructions, or just telling employees to “read the manual,” will often not be enough to satisfy Article 4. Companies will need to sit down and think about:

  • How they want to use artificial intelligence
  • Who specifically within the company will use what AI systems
  • What training those groups will need to use AI in a compliance- and risk-aware manner

What’s interesting here is that, broadly speaking, these are the same principles a compliance officer would use to guide anti-corruption training. That is, you’d always want to understand how your business works and how corruption might creep into operations. You’d want to provide training that addresses anti-corruption broadly rather than the Foreign Corrupt Practices Act specifically. And you’d want role-based training, where an overseas sales agent receives different training from an accounts payable clerk approving invoices.

So, the task here is to bring together the right senior people within your enterprise to talk about AI usage, and then decide on the proper level of AI literacy your employees (and third parties!) will need.

Start building your AI training capabilities now

As we mentioned earlier, the enforcement climate for Article 4 isn’t yet clear – but really, does it need to be? Reckless use of artificial intelligence can cause great harm to businesses right now. Some of that harm includes violations of other regulations, such as privacy laws, but it could also include major disruptions to operations, bad headlines, or civil litigation.

Indeed, that should be your first message to senior management and business operations leaders: that a thoughtful approach to AI literacy and training is a business priority as much as it is a compliance obligation.

Then what? Consider a few practical steps to take ASAP.

Form an AI governance committee. The committee should include leaders from across the enterprise (IT, security, legal, finance, business operations, and of course compliance). It can then define your primary AI risks, draft policies for acceptable AI usage, and identify key risk indicators you should monitor.

Inventory all the AI systems you use. You won’t be able to teach AI literacy or manage AI risks without knowing how many AI systems are running within your business and what those systems are doing. The inventory should also include a list of all data sources, internal and external, that each AI system uses.

Match employees to AI systems and risks. From there, you can develop the role-based training that Article 4 wants to see, and which just makes good sense.

Revisit all of your AI inventories, risks, and training on a regular basis, and update them as necessary – because companies’ use of artificial intelligence is going to go nowhere but up, and your risks will keep evolving. So will the AI literacy your employees need.
