Your AI training programme covers UK compliance. Or it covers EU compliance. But does it cover both? Since Brexit, the UK and EU have diverged sharply on AI regulation. The EU's AI Act imposes prescriptive, risk-based obligations. The UK's approach relies on principles and sector-specific regulators. For the estimated 45,000 UK companies that trade with or operate in the EU, this regulatory fork creates a dual-compliance challenge that most AI training programmes completely ignore.

By Toni Dos Santos, Co-Founder, Spicy Advisory

Two Paths, One Workforce

Let's map the divergence clearly. The UK and EU started from the same regulatory foundation — the GDPR — but have moved in fundamentally different directions on AI.

The EU AI Act: Prescriptive and Risk-Based

The EU AI Act, which entered into force in August 2024 with phased implementation through 2026, is the world's first comprehensive AI regulation. It classifies AI systems into four risk categories:

- Unacceptable risk: practices banned outright, such as social scoring and manipulative techniques
- High risk: systems subject to conformity assessments, technical documentation, and human oversight requirements
- Limited risk: systems subject to transparency obligations, such as disclosing that a user is interacting with AI
- Minimal risk: systems with no additional obligations

The penalties are severe: up to €35 million or 7% of global annual turnover, whichever is higher, for the most serious violations.
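The penalty ceiling arithmetic is worth making concrete. Below is a minimal sketch of the "greater of €35M or 7% of turnover" rule for the most serious violations; the function name is ours, and the sketch deliberately ignores the lower penalty tiers that apply to other violation types.

```python
def max_eu_ai_act_fine(global_turnover_eur: float) -> float:
    """Illustrative ceiling for the most serious EU AI Act violations:
    the greater of EUR 35 million or 7% of global annual turnover.
    (Simplified: lower tiers for other violation types are not modelled.)"""
    return max(35_000_000, global_turnover_eur * 7 / 100)

# A firm with EUR 2 billion global turnover: 7% is EUR 140M,
# which exceeds the EUR 35M floor.
print(max_eu_ai_act_fine(2_000_000_000))  # 140000000.0

# A firm with EUR 100 million turnover: 7% is only EUR 7M,
# so the EUR 35M floor applies.
print(max_eu_ai_act_fine(100_000_000))  # 35000000
```

The point for training audiences: the fixed floor means small firms are not insulated by low turnover, while the percentage cap scales the exposure for large ones.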

The UK Approach: Principles-Based and Sector-Specific

The UK government's Pro-Innovation Approach to AI Regulation, first published in March 2023 and updated through 2025, takes a deliberately different path. Instead of a single horizontal regulation, the UK establishes five cross-cutting principles — safety, transparency, fairness, accountability, and contestability — and delegates enforcement to existing sector-specific regulators.

In practice, this means there is no formal risk classification system, no mandatory conformity assessment, and no centralised AI registry. Each regulator applies the five principles within its own remit: the ICO for AI involving personal data, the FCA for financial services, and so on. The UK approach gives companies more flexibility but also less certainty about what compliance looks like.

Why This Divergence Matters for Training

Here's the problem: 73% of UK companies with EU operations have no dual-compliance AI training programme in place, according to a 2025 survey by the CBI. Most either train to UK standards only (assuming EU compliance will sort itself out) or train to EU standards only (over-engineering for domestic use). Neither approach is adequate.

The Compliance Training Gap: What Most UK Companies Are Missing

The most dangerous gap in UK AI training is the EU AI Act's extraterritorial scope. Article 2 of the EU AI Act applies to:

- Providers placing AI systems on the EU market, wherever those providers are established
- Deployers of AI systems located within the EU
- Providers and deployers outside the EU whose AI systems produce outputs used within the EU

This means a UK-headquartered company that sells an AI-powered product to EU customers, or whose AI system produces outputs consumed by EU-based users, must comply with the EU AI Act — even though the company is no longer in the EU.
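The scope logic above is easy to encode as a first-pass screening question for training exercises. The following sketch is illustrative: the class, field names, and function are ours, and a "no" from this check is a prompt for proper legal review, not a conclusion.

```python
from dataclasses import dataclass

@dataclass
class AISystemProfile:
    """Hypothetical profile of one AI system (illustrative fields, ours)."""
    placed_on_eu_market: bool  # sold or offered to EU customers
    deployed_in_eu: bool       # operated from an EU establishment
    outputs_used_in_eu: bool   # outputs consumed by EU-based users

def eu_ai_act_in_scope(profile: AISystemProfile) -> bool:
    """Rough first-pass screen for Article 2 extraterritorial scope:
    any one trigger is enough. Not legal advice."""
    return (profile.placed_on_eu_market
            or profile.deployed_in_eu
            or profile.outputs_used_in_eu)

# The UK fintech example from the text: no EU entity, no EU sales,
# but credit-scoring outputs are used by EU-based customers.
fintech = AISystemProfile(placed_on_eu_market=False,
                          deployed_in_eu=False,
                          outputs_used_in_eu=True)
print(eu_ai_act_in_scope(fintech))  # True
```

Note the "or" structure: a UK company does not need an EU presence or EU sales for the Act to apply; outputs consumed in the EU are sufficient on their own.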

The practical implications for training are significant. The table below summarises the divergences your teams need to understand:

Dimension | UK Approach | EU AI Act
Regulatory model | Principles-based, sector-specific | Prescriptive, horizontal regulation
Risk classification | No formal system | Four-tier: unacceptable, high, limited, minimal
Conformity assessment | Not required | Mandatory for high-risk systems
AI registry | None | EU database for high-risk systems
Enforcement body | Existing regulators (ICO, FCA, etc.) | National authorities + EU AI Office
Maximum penalties | Varies by regulator (e.g., ICO: £17.5M/4%) | €35M or 7% of global turnover

The Spicy Dual-Compliance AI Training Matrix

We've developed a role-based training matrix that maps exactly who needs to know what about each regulatory regime. The principle is simple: not everyone needs to be an expert in both frameworks, but specific roles need specific knowledge of specific obligations.

Tier 1: All Staff — AI Regulatory Awareness (2 Hours)

Every employee who uses or is affected by AI needs a baseline understanding of both frameworks: what each regime requires at a high level, when the EU AI Act reaches UK activities, and how to recognise and escalate potential compliance issues.

Tier 2: Legal, Compliance, and DPO Teams — Deep Regulatory Dive (1 Day)

These teams need comprehensive knowledge of both frameworks, from the EU AI Act's risk classification, conformity assessment, and documentation obligations to the guidance each UK sector regulator applies within its domain.

Tier 3: Product, Engineering, and Data Teams — Technical Compliance (1 Day)

Teams building or deploying AI systems need practical compliance skills: classifying systems by risk, producing the technical documentation the EU AI Act requires, and designing in the human oversight that high-risk systems demand.

Tier 4: Procurement Teams — Vendor Compliance (Half Day)

Procurement teams are the gatekeepers for AI tools entering the organisation. They need to evaluate vendor compliance documentation, negotiate AI-specific contractual clauses, assess data residency and transfer mechanisms, and spot when a vendor's system qualifies as high-risk under the EU AI Act.

Tier 5: C-Suite and Board — Strategic Regulatory Briefing (Half Day)

Senior leaders need to understand the strategic implications of dual-market exposure: penalty risk, market-access decisions, and where accountability for AI governance sits.
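The five tiers above can be encoded as a simple lookup, which is handy when assigning training in an HR or LMS system. The structure below is our illustration, not a product API; the course names and durations come from the tier headings, and we assume "1 day" means 8 hours and "half day" means 4.

```python
# Illustrative encoding of the role-based training matrix described above.
# Tier names and durations follow the text; hour conversions are assumptions
# (1 day = 8h, half day = 4h).
TRAINING_MATRIX = {
    "all_staff":            {"course": "AI Regulatory Awareness",       "hours": 2},
    "legal_compliance_dpo": {"course": "Deep Regulatory Dive",          "hours": 8},
    "product_eng_data":     {"course": "Technical Compliance",          "hours": 8},
    "procurement":          {"course": "Vendor Compliance",             "hours": 4},
    "c_suite_board":        {"course": "Strategic Regulatory Briefing", "hours": 4},
}

def course_for(role: str) -> str:
    """Return the assigned course and duration for a role key."""
    entry = TRAINING_MATRIX[role]
    return f"{entry['course']} ({entry['hours']}h)"

print(course_for("procurement"))  # Vendor Compliance (4h)
```

Keeping the matrix as data rather than prose makes it easy to audit coverage: summing hours per role, or flagging roles with no assigned tier.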

Implementation: The 90-Day Sprint to Dual Compliance

Moving from single-market to dual-market AI compliance readiness doesn't require a year-long programme. We recommend a focused 90-day sprint that moves from assessing your cross-border AI exposure, through rolling out the role-based training tiers, to embedding compliance checks in day-to-day workflows.

Spicy Advisory's cross-border positioning — headquartered in Paris, serving clients across the UK and EU — gives us unique insight into how both regulatory regimes operate in practice. We've seen how French companies navigate the EU AI Act and how UK companies adapt their governance frameworks. This dual-market experience informs every aspect of our training programmes.

Operating across the UK and EU? Need your teams trained on both regulatory frameworks? Spicy Advisory is uniquely positioned to deliver dual-compliance AI training — we're based in Paris with deep expertise in both the EU AI Act and UK regulatory landscape. Our Dual-Compliance AI Training Matrix is tailored to your organisation's specific cross-border exposure. Book a discovery call.

Frequently Asked Questions

Does the EU AI Act apply to UK companies?

Yes, in many cases. The EU AI Act has extraterritorial scope under Article 2. It applies to UK companies that place AI systems on the EU market, deploy AI systems within the EU, or produce AI outputs that are used within the EU. For example, a UK fintech whose credit-scoring AI is used by EU-based customers must comply with the EU AI Act's requirements for high-risk AI systems — including conformity assessments, technical documentation, and human oversight requirements — even though the company is headquartered in the UK. Companies that have no EU customers, operations, or outputs used in the EU are not affected.

What is the UK equivalent of the EU AI Act?

The UK does not have a direct equivalent of the EU AI Act. Instead of a single comprehensive AI regulation, the UK follows a principles-based approach where existing sector-specific regulators apply five cross-cutting principles (safety, transparency, fairness, accountability, contestability) to AI within their domains. The ICO handles AI and personal data, the FCA covers financial services AI, and so on. The UK government has signalled that more formal AI legislation may come in 2026-2027, but for now, organisations must navigate a patchwork of existing regulations applied to AI contexts. This gives companies more flexibility but requires them to understand which regulator applies to each AI use case.

How do ICO and CNIL enforcement approaches differ?

The ICO (UK) and CNIL (France, and a leading EU enforcer) have notably different enforcement styles. The ICO tends to be more pragmatic and engagement-focused — it often issues guidance and recommendations before formal enforcement, and its fines have historically been lower than those of some EU counterparts. The CNIL is more aggressive and precedent-setting — it has issued some of the largest GDPR fines in Europe and has been proactive on AI-specific enforcement. For AI compliance, the CNIL has published detailed recommendations on AI and personal data that go beyond ICO guidance in specificity. In practice, organisations should design their compliance frameworks to meet the stricter standard (typically the EU/CNIL approach) and then adjust for UK-specific requirements.

What training do procurement teams need for AI compliance?

Procurement teams are critical but often overlooked in AI compliance training. They need practical skills in four areas: evaluating vendor AI compliance documentation (does the vendor have conformity assessments, technical documentation, and risk assessments for high-risk systems?); understanding key contractual clauses for AI procurement (liability allocation, data processing agreements, transparency commitments, audit rights, model training opt-outs); assessing data residency and cross-border transfer mechanisms (where is data processed, what transfer mechanisms are in place?); and determining whether a vendor's AI system qualifies as high-risk under the EU AI Act (which triggers additional procurement due diligence). A half-day focused workshop with practical exercises — such as reviewing real AI vendor contracts — is typically sufficient.