"Where does the data go?" It's the question every UK CTO should be asking about every AI tool in their stack — and most can't answer it. A 2025 DSIT survey found that 61% of UK enterprises using AI tools cannot confirm where their data is processed. For a country that has spent billions building data protection frameworks since GDPR, that's a remarkable blind spot. And it's one that the ICO is increasingly focused on closing.

By Toni Dos Santos, Co-Founder, Spicy Advisory

The Data Residency Question UK Companies Aren't Asking

Data residency — the physical and jurisdictional location where data is stored and processed — has always mattered for regulated industries. But AI has made it a board-level concern for every organisation, for three reasons.

AI Changes the Data Flow Equation

Traditional SaaS tools have relatively simple data flows: data goes in, gets stored in a known location, and stays there. AI tools are fundamentally different. When an employee enters a prompt into an AI assistant, that data may travel through:

- The provider's API gateway or web front end, which may sit in a different region from the user
- Inference infrastructure (the servers actually running the model)
- Logging and abuse-monitoring systems, often retained separately from the conversation itself
- Storage for conversation history, caching, and backups

Each of these stages may involve data being processed in different jurisdictions. A UK employee using a US-headquartered AI tool may have their data processed in the US, stored in Ireland, and logged in a third location entirely.

UK GDPR and Cross-Border Transfers Post-Brexit

The UK's data protection regime — the UK GDPR and Data Protection Act 2018 — imposes strict requirements on international data transfers. Data can only be transferred outside the UK to countries with an adequacy decision, or where appropriate safeguards are in place (Standard Contractual Clauses, Binding Corporate Rules, etc.).

Post-Brexit, the UK has its own adequacy framework separate from the EU's. The UK has granted adequacy to the EU, EEA, and several other jurisdictions. For US transfers, the UK Extension to the EU-US Data Privacy Framework provides a mechanism — but only for US organisations that have self-certified under the framework. Not all AI providers have completed this certification.

ICO Enforcement Is Intensifying

The ICO's enforcement priorities for 2025-26 explicitly include AI and automated decision-making. The ICO has been conducting proactive audits of organisations' AI data processing practices, with particular focus on transparency, data minimisation, and international transfers. Several enforcement actions in 2024-25 specifically cited inadequate data transfer safeguards for AI tools.

The message is clear: if your organisation is using AI tools and you can't demonstrate where data is processed, what safeguards are in place, and what legal basis underpins the processing, you're exposed.

Mapping the AI Tool Landscape by Data Residency

Not all AI providers are equal when it comes to data residency. Here's what UK enterprises need to know about the major platforms:

| Provider | UK Data Residency Available? | EU Data Residency Available? | Training Data Opt-Out | Enterprise DPA |
| --- | --- | --- | --- | --- |
| Microsoft (Azure OpenAI / Copilot) | Yes — Azure UK South, UK West | Yes — multiple EU regions | Yes — enterprise agreements exclude training | Yes — comprehensive DPA |
| Google (Gemini / Workspace AI) | Limited — some Workspace features | Yes — EU data boundary | Yes — enterprise Workspace excludes training | Yes — Cloud DPA |
| OpenAI (ChatGPT / API) | No — US processing (API) | No — US processing | API: Yes. Consumer: Opt-out available | API: Yes. Consumer: Limited |
| Anthropic (Claude / API) | No — US processing | No — US processing | API: Yes. Consumer: Opt-out available | API: Yes. Consumer: Limited |
| Notion AI | No — US processing | No — US processing | Yes — enterprise plans | Yes — enterprise DPA |
| Slack AI | Partial — depends on Slack data residency | Partial — depends on Slack data residency | Yes — not used for training | Yes — Salesforce DPA |

Key insight: Microsoft is currently the only major AI provider offering genuine UK data residency for AI workloads. Google offers EU data residency but UK-specific options are limited. OpenAI and Anthropic process all data in the US, relying on contractual safeguards (SCCs, DPA) rather than geographic residency. For UK enterprises with strict data residency requirements — particularly in financial services, legal, healthcare, and public sector — this significantly narrows the field.
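The comparison above can also be kept as a small machine-readable register, which makes it easy to shortlist providers against a residency requirement as part of procurement. A minimal sketch; the attribute values simply transcribe the table, and the function name is our own:

```python
# Provider register distilled from the comparison table above.
# Values reflect the table at time of writing; re-verify before relying on them.
PROVIDERS = {
    "Microsoft (Azure OpenAI / Copilot)": {"uk_residency": "yes", "eu_residency": "yes"},
    "Google (Gemini / Workspace AI)": {"uk_residency": "limited", "eu_residency": "yes"},
    "OpenAI (ChatGPT / API)": {"uk_residency": "no", "eu_residency": "no"},
    "Anthropic (Claude / API)": {"uk_residency": "no", "eu_residency": "no"},
    "Notion AI": {"uk_residency": "no", "eu_residency": "no"},
    "Slack AI": {"uk_residency": "partial", "eu_residency": "partial"},
}

def uk_residency_candidates(register: dict) -> list:
    """Providers offering full, partial, or limited UK data residency."""
    return [name for name, attrs in register.items()
            if attrs["uk_residency"] in ("yes", "partial", "limited")]
```

Filtering on `"yes"` alone gives the strict shortlist; including `"partial"` and `"limited"` gives the candidates worth a follow-up conversation.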

Consumer vs Enterprise: A Critical Distinction

One of the most important distinctions for UK enterprises is between consumer and enterprise versions of AI tools:

- Consumer versions (free or individual plans of ChatGPT, Claude, Gemini) operate under consumer terms of service: data may be retained, reviewed, and in some cases used for model training unless the user opts out, and there is typically no meaningful Data Processing Agreement.
- Enterprise versions (enterprise plans and API access) operate under a commercial agreement with a DPA, a contractual training exclusion, defined retention periods, and administrative controls.

This distinction must be central to your AI usage policy. Consumer AI tools should be prohibited for any use involving personal data, client data, or commercially sensitive information.

The Spicy AI Data Residency Audit Framework

We've developed a five-step framework that UK enterprises can use to assess and manage data residency risk across their AI tool portfolio.

Step 1: Data Classification

Before you can assess data residency risk, you need to know what data is flowing into AI tools. Classify your data into five categories:

- Public: information already in the public domain
- Internal: non-sensitive operational information
- Confidential: commercially sensitive information such as pricing, strategy, and intellectual property
- Personal data: anything identifying a living individual, as defined by UK GDPR
- Special category and regulated data: health, financial, legal, and other data subject to heightened protection

Your AI usage policy should specify which data categories are permitted in which AI tools.
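A usage policy of this kind is easiest to enforce when it is encoded rather than left in a PDF. A minimal sketch, assuming a deny-by-default stance; the tool names, category labels, and permission sets are illustrative, not recommendations:

```python
from enum import Enum

class DataCategory(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"
    PERSONAL = "personal data"
    SPECIAL_CATEGORY = "special category / regulated"

# Illustrative policy: which categories each approved tool may receive.
TOOL_POLICY = {
    "copilot-enterprise": {DataCategory.PUBLIC, DataCategory.INTERNAL,
                           DataCategory.CONFIDENTIAL},
    "consumer-chatbot": {DataCategory.PUBLIC},
}

def is_permitted(tool: str, category: DataCategory) -> bool:
    """A tool absent from the policy is unapproved: deny by default."""
    return category in TOOL_POLICY.get(tool, set())
```

The deny-by-default lookup matters: an unknown tool returns an empty permission set, so shadow-IT tools fail the check automatically.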

Step 2: Vendor Data Flow Mapping

For each AI tool in use, map the complete data flow:

- Where prompts and uploaded content are sent for processing (inference)
- Where outputs, conversation history, and logs are stored, and for how long
- Which sub-processors are involved, and in which jurisdictions
- Whether any stage involves a transfer outside the UK, and which safeguard covers it

Request this information formally from each vendor. If a vendor cannot provide clear answers to these questions, that itself is a red flag.

Step 3: Contractual Safeguards

Ensure appropriate contractual protections are in place for each AI tool:

- A Data Processing Agreement meeting UK GDPR Article 28 requirements
- Valid transfer mechanisms for any non-adequate jurisdictions: the UK IDTA or the UK Addendum to the EU SCCs, or reliance on the UK Extension to the EU-US Data Privacy Framework where the vendor is certified
- An explicit commitment that your data will not be used for model training
- Notification rights for sub-processor changes, and breach notification obligations
- Audit or inspection rights proportionate to the sensitivity of the data

Step 4: Technical Controls

Contractual safeguards are necessary but not sufficient. Technical controls add a practical layer of protection:

- Data loss prevention (DLP) tooling that inspects outbound prompts for personal or sensitive data
- Network controls that allow approved AI tools and block consumer versions
- Single sign-on and tenant restrictions so employees can only reach the sanctioned enterprise instance
- Region pinning or data-boundary configuration where the provider supports it
- Logging of AI tool usage so you can evidence compliance
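The DLP idea can be illustrated with a simple pattern scan over outbound prompts. This is a minimal sketch only; the patterns shown are examples, and production DLP needs far broader coverage (and should sit in a gateway, not application code):

```python
import re

# Illustrative patterns only; real DLP rulesets are much more extensive.
PATTERNS = {
    "uk_ni_number": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b", re.IGNORECASE),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_prompt(text: str) -> list:
    """Return the names of sensitive-data patterns found in an outbound prompt."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]
```

A hit can then trigger whatever the policy requires: block the request, redact the match, or log it for review.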

Step 5: Ongoing Monitoring

Data residency is not a one-time assessment. AI providers regularly update their infrastructure, data processing practices, and terms of service:

- Schedule periodic (at least quarterly) reviews of each vendor's terms, DPA, and sub-processor list
- Subscribe to vendors' change notifications, and track ICO guidance and UK adequacy decisions
- Re-run the data flow mapping whenever a vendor announces new infrastructure or features
- Re-audit the full AI tool portfolio at least annually
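Monitoring is easiest when each audit leaves a snapshot behind, so the next review only has to diff against it. A minimal sketch; the field names are illustrative examples of what a vendor questionnaire might record:

```python
def residency_drift(baseline: dict, current: dict) -> list:
    """Compare a vendor's current attestations against the last audit snapshot.

    Returns human-readable change notices; an empty list means nothing moved.
    """
    changes = []
    for key in baseline:
        if current.get(key) != baseline[key]:
            changes.append(f"{key}: {baseline[key]!r} -> {current.get(key)!r}")
    return changes

# Example: a vendor quietly moving processing out of the UK between audits.
baseline = {"processing_region": "UK South", "trains_on_data": False}
current = {"processing_region": "US East", "trains_on_data": False}
```

Here `residency_drift(baseline, current)` would flag the `processing_region` change, turning a silently updated terms-of-service page into an explicit audit finding.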

Sector-Specific Considerations

While the framework above applies universally, certain UK sectors have additional data residency requirements:

Financial Services

The FCA and PRA expect financial services firms to maintain operational resilience for critical business services — which increasingly includes AI tools. PS21/3 (Operational Resilience) requires firms to identify important business services and set impact tolerances. AI tools that process customer financial data may need to be classified as part of important business services, with data residency considered as part of the resilience assessment.

Legal Sector

Legal professional privilege adds a unique dimension. If privileged communications are entered into AI tools, the privilege may be waived if the data is accessed by third parties (including the AI provider) without appropriate safeguards. Law firms should treat all AI tools processing client data as requiring the highest level of data residency assurance.

Public Sector

UK Government Cloud guidelines require that OFFICIAL data is processed within the UK or in countries with adequate data protection. For OFFICIAL-SENSITIVE and above, UK-only data residency is typically mandatory. Public sector organisations adopting AI must ensure their tools meet these classification requirements.

Need help auditing your AI tool portfolio for data residency compliance? Spicy Advisory's AI Data Residency Audit uses our five-step framework to map your data flows, assess your contractual safeguards, and identify gaps before the ICO does. Book a discovery call.

Frequently Asked Questions

Does UK GDPR require data to stay in the UK?

No. UK GDPR does not require data to remain in the UK. It requires that any transfer of personal data outside the UK has appropriate safeguards in place. Data can be transferred to countries with UK adequacy decisions (including EU/EEA countries) without additional safeguards. For transfers to other countries (including the US), organisations must use appropriate transfer mechanisms — typically Standard Contractual Clauses (the UK IDTA or UK Addendum to EU SCCs), or rely on the UK Extension to the EU-US Data Privacy Framework for certified US organisations. The key obligation is not geographic restriction but ensuring equivalent protection wherever data is processed.

Is Microsoft Copilot data processed in the UK?

Microsoft offers UK data residency for Microsoft 365 Copilot through its Azure UK South and UK West data centres. For organisations with UK-based Microsoft 365 tenants, Copilot data processing and storage can be configured to remain within UK data centres. However, the specifics depend on your Microsoft licensing agreement, tenant configuration, and which Copilot features you use. Some advanced features may involve processing outside the UK. We recommend reviewing Microsoft's data residency documentation for Copilot specifically, and confirming the data processing location in your enterprise agreement. Microsoft's commitment to not training its foundation models on enterprise customer data applies regardless of data residency location.

What is the difference between API and consumer AI tools for data residency?

The difference is significant and often misunderstood. Consumer AI tools (free versions of ChatGPT, Claude, Gemini) typically process data under the provider's consumer terms of service, which may include broader rights to retain, analyse, and potentially use data for model improvement. Enterprise API access operates under a separate commercial agreement with a Data Processing Agreement that provides contractual commitments on data handling, retention, and training exclusion. For UK enterprises, the practical implication is that consumer AI tools should not be used for any processing involving personal data or commercially sensitive information, while enterprise API access — with appropriate contractual safeguards — can be used for a wider range of business purposes.

Do AI companies use my data to train their models?

It depends on which version of the tool you use. For consumer/free versions: historically, most AI providers used consumer interactions for model training. This is changing — OpenAI, Anthropic, and Google all now offer opt-out mechanisms for consumer users. However, even with opt-out, data may still be retained for safety monitoring and abuse detection. For enterprise/API versions: reputable AI providers contractually commit to not using enterprise customer data for model training. This should be explicitly stated in your Data Processing Agreement. Always verify this commitment in your specific enterprise agreement — don't assume based on the provider's general marketing statements. The distinction between consumer and enterprise data handling is one of the most important factors in AI procurement for UK businesses.