Every AI request has a hidden variable: where did the data go?

When you call an LLM API, your prompt travels to a server somewhere. For many applications, "somewhere" doesn't matter. For financial services, healthcare, government, and enterprise use cases, it matters a lot.

Data residency laws are multiplying globally. They create real compliance risk for AI deployments.

The regulatory patchwork

What started as a European concern (GDPR) has spread everywhere. Each jurisdiction adds its own requirements.

Brazil: LGPD

Brazil's Lei Geral de Proteção de Dados took effect in 2020 and has teeth.

  • Personal data of Brazilian residents must be processed in compliance with LGPD regardless of where the processor sits
  • Cross-border transfers require adequate protection guarantees
  • Fines up to 2% of Brazilian revenue, capped at R$50 million per violation

AI workloads involving personal data of Brazilian users need compliant infrastructure, or you need to document your legal basis for international transfer.

India: DPDP Act

India's Digital Personal Data Protection Act (2023), layered on sectoral rules, pushes processing onto Indian soil.

  • The DPDP Act permits cross-border transfers only to countries the government has not restricted
  • Sectoral regulators go further: the RBI requires payment system data to be stored in India
  • Penalties run up to ₹250 crore (~$30M USD) per violation

The government is building domestic AI compute through the IndiaAI Mission, with the clear expectation that AI workloads will run locally.

Indonesia: Government Regulation 71/2019

Indonesia mandates data localization for public systems and critical sectors.

  • Public sector data must stay in Indonesian territory
  • Financial services face strict localization requirements
  • Strategic data categories require in-country processing

Malaysia, Vietnam, and other ASEAN nations have similar rules.

EU: GDPR plus AI rules

Europe layers GDPR's data protection with new AI-specific regulations.

  • Data transfers outside the EU require Standard Contractual Clauses or adequacy decisions
  • The EU AI Act adds compliance requirements for AI systems
  • Schrems II invalidated Privacy Shield in 2020, and its successor, the EU-US Data Privacy Framework, already faces similar legal challenges

France's SecNumCloud certification is now required for sensitive French public sector workloads. Most US hyperscalers don't qualify.

What non-compliance actually costs

Fines get attention, but they're often the smallest cost.

Direct penalties

The numbers:

  • GDPR: Up to €20M or 4% of global annual revenue, whichever is higher
  • LGPD: Up to 2% of Brazilian revenue per violation
  • DPDP: Up to ₹250 crore per violation
  • Indonesia: Varies by sector, includes operational restrictions

But fines assume you get caught and prosecuted. More immediate costs are often worse.

Market access

Non-compliance increasingly means exclusion.

  • Government contracts require certified compliant infrastructure
  • Enterprise procurement teams check data residency in vendor assessments
  • Financial regulators can block non-compliant AI vendors

If you can't demonstrate compliance, you can't bid. Your addressable market shrinks.

Operational disruption

Enforcement can halt operations entirely.

  • Orders to cease data processing
  • Required notification of affected individuals
  • Mandated audits and remediation programs

For AI systems running core business operations, a processing suspension can be catastrophic.

Reputation

Data protection violations make news. Enterprise customers factor vendor compliance history into procurement. One violation can poison relationships for years.

Why AI workloads are especially exposed

Traditional software could sometimes route around data residency requirements. A web app might store user data locally while processing requests centrally.

AI doesn't have that option.

Inference requires data movement. When you call an LLM API, your prompt travels to wherever the model runs. If that's a US data center and your user is in Brazil, you may have violated LGPD.

Context windows compound exposure. Modern AI applications send full conversation histories, RAG-retrieved documents, and user data with every call, so a single request can carry far more personal data than a traditional API call.

Third-party APIs multiply risk. Every AI provider in your stack is a compliance liability. If your app calls OpenAI, whose API processes in the US by default, you inherit that exposure.

Building for compliance

You don't need data centers in every jurisdiction. You need smart infrastructure.

Multi-region routing

Route AI inference to compliant compute based on request jurisdiction:

  • Brazilian user → Brazil-compliant infrastructure
  • Indian user → India-compliant infrastructure
  • EU user → GDPR-compliant infrastructure

This routing should happen automatically, without application code changes; a minimal sketch follows.
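
Here is a rough sketch of that routing layer, assuming a hypothetical jurisdiction-to-endpoint map with placeholder URLs; nothing here is a specific provider's API:

    # Sketch of jurisdiction-based inference routing. Region codes and
    # endpoint URLs are illustrative placeholders, not a real provider.
    import requests

    JURISDICTION_ENDPOINTS = {
        "BR": "https://inference.br.example.com/v1",  # Brazil-resident compute
        "IN": "https://inference.in.example.com/v1",  # India-resident compute
        "EU": "https://inference.eu.example.com/v1",  # EU-resident compute
    }
    FALLBACK_ENDPOINT = "https://inference.us.example.com/v1"

    def resolve_endpoint(jurisdiction: str) -> str:
        # Map the user's jurisdiction to compliant compute; fall back only
        # where no residency rule applies.
        return JURISDICTION_ENDPOINTS.get(jurisdiction, FALLBACK_ENDPOINT)

    def route_inference(prompt: str, jurisdiction: str) -> dict:
        # The prompt is sent only to the resolved region's infrastructure.
        endpoint = resolve_endpoint(jurisdiction)
        resp = requests.post(f"{endpoint}/completions",
                             json={"prompt": prompt}, timeout=30)
        resp.raise_for_status()
        return resp.json()

In production this mapping belongs in the gateway or SDK layer, so application code never picks a region explicitly.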

Provider questions to ask

  • Where is compute physically located?
  • What certifications does infrastructure hold?
  • Can you guarantee data doesn't leave the jurisdiction?
  • What's the audit trail?

Design for compliance early

  • Classify data by sensitivity and jurisdiction at ingestion
  • Route requests based on classification
  • Keep audit logs of where data was processed
  • Build compliance reporting into your observability (a sketch of these pieces follows)
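
A minimal sketch of classification at ingestion plus an append-only audit log, using illustrative rules, field names, and a placeholder log path rather than any prescribed schema:

    # Sketch: classify each record at ingestion, then log where it was
    # processed. Rules, field names, and the log path are assumptions.
    import json
    import time
    from dataclasses import dataclass

    @dataclass
    class Classification:
        jurisdiction: str  # e.g. "BR", "IN", "EU"
        sensitivity: str   # e.g. "personal" or "anonymous"

    def classify(record: dict) -> Classification:
        # Real systems derive this from user residency, data type, and
        # contract terms; a country field stands in for that logic here.
        sensitivity = "personal" if "user_id" in record else "anonymous"
        return Classification(record.get("country", "US"), sensitivity)

    def audit_log(record_id: str, c: Classification, endpoint: str) -> None:
        # Append-only trail of what was processed, under which
        # classification, and where. This is the record auditors ask for.
        entry = {
            "ts": time.time(),
            "record": record_id,
            "jurisdiction": c.jurisdiction,
            "sensitivity": c.sensitivity,
            "processed_at": endpoint,
        }
        with open("processing_audit.jsonl", "a") as f:
            f.write(json.dumps(entry) + "\n")

    if __name__ == "__main__":
        record = {"id": "r1", "user_id": "u42", "country": "BR"}
        c = classify(record)
        # resolve_endpoint from the routing sketch would pick the
        # Brazil-resident endpoint; hard-coded here for self-containment.
        audit_log(record["id"], c, "https://inference.br.example.com/v1")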

Retrofitting compliance costs more than building it in.

Bottom line

Data residency requirements are accelerating. The sovereign cloud market is projected to reach $823 billion by 2032, driven by enterprises that recognize compliance and AI capability can coexist.

Companies that treat sovereignty as a design principle will gain market access. Those that ignore it will watch their addressable market shrink.

Build for compliance now, or pay more to fix it later.
