AI Privacy Pro Team · 15 min read

Enterprise AI DPAs vs. Zero Data Retention (ZDR): What Professionals Need to Know

An enterprise buyer's guide to AI Data Processing Addendums (DPAs) and Zero Data Retention (ZDR) agreements across ChatGPT Enterprise, Claude Enterprise, Gemini Enterprise, and Microsoft 365 Copilot—covering pricing, contractual scope, common misconceptions, and a defense-in-depth posture combining ZDR with local pre-transmission anonymization.

Tags: AI DPA, Zero Data Retention, ZDR Agreement, ChatGPT Enterprise, Claude Enterprise, Gemini Enterprise, Microsoft 365 Copilot, Enterprise AI Privacy, GDPR, HIPAA, Legal Ethics, CamoText, Defense in Depth

TL;DR

A signed Data Processing Addendum (DPA) is not the same thing as a Zero Data Retention (ZDR) agreement. For enterprise chat seats, a DPA does not equal ZDR. Treat ChatGPT, Claude, Gemini, and Copilot Enterprise/Business products as controlled-retention enterprise products, not zero-retention products, unless the contract explicitly says ZDR applies to the specific surface you are using. A ZDR amendment plus a pre-transmission local anonymization step (for example, CamoText) is defense in depth: contractual protection at the vendor layer and technical protection at the transmission layer. Either alone is better than neither; both together is the most defensible posture under ethics frameworks like the New York City Bar's Formal Opinion 2024-5.

Why the DPA vs. ZDR Distinction Matters

Most enterprise procurement teams reach for "the DPA" as their primary privacy artifact when rolling out an AI service. A DPA is necessary but not sufficient. It establishes that the vendor is acting as a data processor on the customer's behalf, defines the scope of permitted processing, prohibits use of customer data for model training, and sets out security and breach-notification commitments. What it generally does not do—on its own—is force the vendor to immediately discard prompts and responses after each request.

Standard enterprise DPAs from the major AI providers contemplate controlled retention: prompts, outputs, and intermediate state are typically retained for a bounded period (commonly 30 days, sometimes longer for chat history features) for purposes like abuse monitoring, debugging, and customer-controlled session resumption. That is a legitimate enterprise default, but it is materially different from the "nothing is stored after the response is returned" posture many buyers assume they are getting.

What "DPA" actually covers

  • Roles and responsibilities under GDPR / UK GDPR / CCPA (controller vs. processor)
  • Sub-processor disclosures and right-to-object provisions
  • Security commitments (encryption in transit and at rest, access controls, audit certifications)
  • Breach notification timelines
  • Cross-border transfer mechanisms (Standard Contractual Clauses, UK IDTA, etc.)
  • Limitations on use of customer data for training (typically: not used by default for the enterprise tier)
  • Deletion on termination of the underlying agreement

What "ZDR" adds on top

  • Customer prompts and model outputs are not stored at rest after the response is returned
  • Abuse-monitoring logs are sanitized of identifiable customer content (or, in stricter ZDR variants, suppressed altogether)
  • Identifying metadata such as IP addresses and account IDs is stripped from any retained operational records
  • The arrangement is typically scoped to specific endpoints or product surfaces, not the vendor's entire offering
  • It almost always requires a separate amendment, eligibility review, and per-organization enablement

Common procurement mistake

"We signed the DPA, so prompts aren't stored." Almost universally false for enterprise chat products. The DPA limits how data can be used; it generally does not promise that data isn't retained at all. ZDR is the contract clause that promises non-retention, and it is usually only available on specific API endpoints—not on the chat UI your employees actually use.

Visualizing the Difference

The two diagrams below show the same prompt traveling through (1) an enterprise product governed by a DPA only and (2) the same product with a ZDR amendment in place. The contractual surface is identical; the technical retention surface is not.

[Figure: two data-flow diagrams. "DPA only (controlled retention)": the user prompt is processed by the vendor API; the abuse-monitoring log retains prompt and output for roughly 30 days; chat history/threads persist until the user or admin deletes them. "DPA + ZDR amendment (zero data retention)": the prompt is processed by an eligible endpoint; no content is kept at rest, only sanitized abuse signals; no persistent thread exists unless a feature is excluded.]

ZDR is endpoint- and surface-specific. Persistent chat features (threads, history, vector stores, file uploads) are commonly excluded from ZDR coverage.
Figure 1. Data flow under a DPA-only arrangement versus a DPA plus ZDR amendment for the same enterprise AI service.

The Major Vendors at a Glance

The table below summarizes how each of the four most common enterprise AI services treats DPA coverage, ZDR availability, and pricing. Always verify against the current product terms before relying on any of these summaries for a specific deployment.

| Service | List price (per seat / per user) | DPA available? | ZDR available? | ZDR scope |
| --- | --- | --- | --- | --- |
| ChatGPT Business | ~$25/user/mo (annual) or $30 (monthly) | Yes | No (chat surface) | N/A — controlled retention only |
| ChatGPT Enterprise | Custom (commonly ~$60/seat/mo, 150-seat min historically) | Yes | Limited — not for the chat UI | API endpoints only (Chat Completions, Responses, embeddings, etc.) |
| OpenAI API platform | Pay-as-you-go (per-token model pricing) | Yes | Yes — by amendment | Eligible endpoints only; sales approval required |
| Claude Team | ~$25/user/mo (annual) or $30 (monthly) | Yes | No | N/A |
| Claude Enterprise | $20/seat/mo + usage at API rates (self-serve); custom (sales-assisted, 50-seat min) | Yes | Limited — not for the chat product interface | Claude Code on Claude for Enterprise (per-org enablement); commercial API keys for Messages and Token Counting APIs |
| Anthropic API | Pay-as-you-go (per-token) | Yes | Yes — by amendment | Messages API and Token Counting API; not Console/Workbench |
| Gemini Enterprise (Workspace) | Bundled with Workspace Business/Enterprise (no add-on cost as of 2025+) | Yes (Workspace DPA) | No (chat surface) | N/A — Workspace data protections apply |
| Gemini API / Vertex AI | Pay-as-you-go (per-token) | Yes | Yes — per-project approval | Logging sanitized of user content; certain features (Grounding, Live API, File API, Caching) excluded or require opt-out |
| Microsoft 365 Copilot | $30/user/mo (annual) | Yes (Microsoft Products and Services DPA) | No (controlled retention via tenant policy) | "Enterprise Data Protection" (EDP) — prompts/responses logged under tenant retention policies; not ZDR |
| Azure OpenAI Service | Pay-as-you-go (per-token) | Yes | "Modified abuse monitoring" (closest analog) — by application | Approved customers can have content excluded from abuse-monitoring storage |

Pricing summarized from public list pricing as of April 2026. Treat as orientation; verify on each vendor's current pricing page and order form. Enterprise discounts, BAA add-ons, and HIPAA-ready configurations frequently change effective price.

Vendor-by-Vendor Notes

OpenAI: ChatGPT Enterprise, ChatGPT Business, and the API platform

OpenAI's enterprise DPA covers both the ChatGPT Enterprise/Business chat products and the API. By default, API customer data is retained for up to 30 days and ChatGPT Enterprise data is retained for the term of the agreement (with customer-configurable retention controls). Models are not trained on enterprise customer data.

ZDR is a separate amendment. It is approved on a per-organization basis through OpenAI sales, and it applies only to specific eligible API endpoints—primarily /v1/chat/completions, /v1/responses, embeddings, moderation, and image generation. Persistent surfaces such as /v1/conversations, /v1/threads, /v1/assistants, vector stores, and file storage are not ZDR-eligible because they require state to function. The ChatGPT Enterprise chat UI is itself not a ZDR product; persistent chat history is part of how the product works. Any non-US data residency configuration also requires the ZDR amendment to be in place.

Anthropic: Claude Enterprise and the Anthropic API

Anthropic's Claude Enterprise plan is now usage-based: $20/seat/month (annual, self-serve) plus token usage billed at API rates. By default, Anthropic does not train on enterprise customer data, and custom data retention controls are part of the enterprise feature set. HIPAA-ready configuration with a BAA is available on the sales-assisted plan only.

Anthropic's ZDR coverage is documented at the Claude API ZDR page and is more conservative than many buyers expect. ZDR applies to:

  • The Claude Messages API and Token Counting API (using commercial organization API keys)
  • Claude Code when used with commercial API keys, or when used through Claude for Enterprise with ZDR enabled at the organization level

ZDR explicitly does not apply to Console / Workbench, consumer plans (Free, Pro, Max), or the Claude Teams and Claude Enterprise chat product interfaces. It must be enabled separately for each new organization. Deployments on AWS Bedrock, Google Vertex AI, or Microsoft Foundry are governed by those platforms' retention policies, not Anthropic's.

Google: Gemini Enterprise (Workspace) and Gemini API / Vertex AI

As of January 2025, Gemini AI features are bundled into Google Workspace Business and Enterprise editions at no additional add-on cost; existing Workspace data protections apply automatically, including the Workspace DPA. Workspace customer interactions with Gemini are not used to train foundation models without permission. This is a strong controlled-retention posture, not ZDR.

For developer use of the Gemini API, Google's ZDR documentation explains a per-project approval model. When ZDR is enabled, abuse-monitoring records are sanitized of customer content and identifiable metadata. Several features are excluded from ZDR or require explicit opt-outs to achieve a true zero-data footprint:

  • Grounding with Google Search/Maps — 30-day retention, no opt-out
  • Live API — sessions store conversation state for resumption (avoid configuring SessionResumptionConfig)
  • Interactions API — set store=false
  • File API — files persist until manually deleted
  • Explicit context caching — cached content stored for the configured TTL
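To catch these exclusions in code review or CI, a request-config audit can flag settings that would defeat a ZDR posture. The sketch below is a hedged illustration: the field names are stand-ins for the exclusions listed above, not the Gemini SDK's actual parameter names.

```python
# Sketch: flag request settings that would put content outside a ZDR posture,
# based on the feature exclusions above. Field names are illustrative
# stand-ins, not the SDK's real parameters.

ZDR_RISK_RULES = {
    "use_search_grounding": "Grounding retains data for 30 days with no opt-out",
    "session_resumption":   "Live API session resumption stores conversation state",
    "store":                "Interactions API persists the interaction unless store=False",
    "cached_content":       "Explicit context caching stores content for the configured TTL",
    "uploaded_files":       "File API objects persist until manually deleted",
}

def audit_request(config: dict) -> list[str]:
    """Return a human-readable finding for each ZDR-risky setting in config."""
    findings = []
    for field, why in ZDR_RISK_RULES.items():
        if config.get(field):  # truthy value means the risky feature is active
            findings.append(f"{field}: {why}")
    return findings
```

Running such a check against every request template before deployment turns the exclusion list from tribal knowledge into an enforced policy.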

For organizations needing self-serve ZDR controls at scale, Google's Gemini Enterprise Agent Platform exposes a more granular set of toggles and is the recommended path for large regulated deployments.

Microsoft: 365 Copilot and Azure OpenAI Service

Microsoft 365 Copilot at $30/user/month (annual) is governed by the Microsoft Products and Services DPA and Product Terms, with Microsoft acting as a data processor. Microsoft markets this as Enterprise Data Protection (EDP): prompts and responses are logged so that tenant-managed retention, sensitivity labels, audit, and Purview policies all apply. EDP is not ZDR. It is a strong, well-integrated controlled-retention model, but the tenant must actively configure retention policies in Microsoft Purview to shrink the retention footprint.

Azure OpenAI Service has its own DPA and supports a "Modified Abuse Monitoring" path that is the closest equivalent to ZDR—approved customers can have their prompts and completions excluded from human review and abuse-monitoring storage. This is gated on a formal application that includes a low-risk-use-case attestation.

Defense in Depth: ZDR Plus Pre-Transmission Anonymization

A signed ZDR amendment is contractual protection at the vendor layer: a promise, backed by audit and breach-notification provisions, that customer content is not retained at rest. Pre-transmission anonymization is technical protection at the transmission layer: sensitive identifiers never leave the workstation in the first place. Each defends against a different failure mode.

The two failure modes ZDR alone does not cover

  • Scope leakage. Many ZDR arrangements exclude specific features (file uploads, vector stores, persistent threads, grounding, caching). It is easy for an employee or pipeline to invoke an excluded surface without realizing it, leaving raw content in vendor storage despite the amendment.
  • In-flight or in-memory exposure. ZDR addresses storage at rest. It does not eliminate the fact that the prompt is processed in vendor-controlled memory and transits the vendor's edge, monitoring, and inference infrastructure. If sensitive identifiers are removed before transmission, the worst-case exposure even in an edge compromise is greatly reduced.
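The scope-leakage failure mode can be reduced operationally with a thin client-side guard that refuses to call endpoints outside the documented amendment. A minimal sketch, using illustrative OpenAI-style endpoint paths (your actual covered and excluded lists come from your own ZDR amendment):

```python
# Sketch: guard against "scope leakage" by refusing requests to endpoints
# outside the documented ZDR amendment. Endpoint paths are illustrative.

ZDR_COVERED = {
    "/v1/chat/completions",
    "/v1/responses",
    "/v1/embeddings",
}

# Surfaces commonly excluded from ZDR because they require persistent state.
KNOWN_EXCLUDED = {
    "/v1/threads",
    "/v1/assistants",
    "/v1/files",
    "/v1/vector_stores",
}

class ZDRScopeError(RuntimeError):
    """Raised when a request targets an endpoint outside the ZDR amendment."""

def assert_zdr_scope(endpoint: str) -> None:
    """Raise unless the endpoint is on the documented ZDR allowlist."""
    if endpoint in ZDR_COVERED:
        return
    if endpoint in KNOWN_EXCLUDED:
        raise ZDRScopeError(f"{endpoint} is explicitly excluded from ZDR coverage")
    raise ZDRScopeError(f"{endpoint} is not on the documented ZDR allowlist")
```

Wiring this check into the HTTP client shared by internal pipelines means an excluded surface fails loudly instead of silently leaving raw content in vendor storage.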

Layering local anonymization tools such as CamoText in front of the AI service replaces names, identifiers, account numbers, and other sensitive entities with deterministic placeholders before the request is serialized over the network. The hash key needed to reverse the substitution stays on the user's device. The vendor receives a working but anonymized prompt; the user sees a de-anonymized response.
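A minimal sketch of the pattern (deterministic placeholder substitution keyed by a local secret) might look like the following. This illustrates the technique, not CamoText's actual implementation, and entity detection itself (NER, regex, dictionaries) is out of scope here:

```python
import hashlib
import hmac

# Sketch of deterministic pre-transmission pseudonymization. The secret key
# drives placeholder derivation and never leaves the workstation, so the
# vendor sees only placeholders and the reversal map stays local.

SECRET_KEY = b"local-only-secret"   # in practice: per-user key in secure storage
_reverse: dict[str, str] = {}       # placeholder -> original, kept on-device

def placeholder(entity: str, kind: str = "NAME") -> str:
    """Derive a stable placeholder: same entity always maps to the same tag."""
    tag = hmac.new(SECRET_KEY, entity.encode(), hashlib.sha256).hexdigest()[:8]
    return f"[{kind}_{tag}]"

def anonymize(text: str, entities: list[str]) -> str:
    """Replace each detected entity with its deterministic placeholder."""
    for entity in entities:
        ph = placeholder(entity)
        _reverse[ph] = entity
        text = text.replace(entity, ph)
    return text

def deanonymize(text: str) -> str:
    """Restore original entities in the model's response, locally."""
    for ph, original in _reverse.items():
        text = text.replace(ph, original)
    return text
```

Determinism matters: because "Jane Doe" maps to the same placeholder in every prompt, the model can still reason consistently about the entity across a conversation, while the mapping needed to reverse the substitution never crosses the trust boundary.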

[Figure: a prompt containing PII leaves the user workstation through a local anonymizer, which replaces names and IDs while the key stays local; the anonymized prompt reaches a ZDR endpoint with no content at rest and sanitized telemetry only; the response is de-anonymized locally. Vendor layer (contractual): the ZDR amendment limits storage on the provider side. Transmission layer (technical): anonymization removes identifiers before they leave the device. Either alone is better than neither; both together is the most defensible posture.]
Figure 2. Layered protection: local pre-transmission anonymization in front of a ZDR-eligible vendor endpoint.

Why this layered posture matches the New York legal-ethics framework

The NYC Bar's Formal Opinion 2024-5 and the New York State Bar Association's April 2024 Task Force report both ground their analysis in NY Rule of Professional Conduct 1.6. Two requirements are doing most of the work:

  • A lawyer must "make reasonable efforts to prevent the inadvertent or unauthorized disclosure" of confidential client information.
  • A lawyer "must not input confidential client information into any Generative AI system that will share the inputted confidential information with third parties" without informed client consent. Anonymization of client information removes the consent requirement: "Consent is not needed if no confidential client information is shared, for example through anonymization of client information."

A standalone enterprise DPA is a meaningful step but does not, on its own, eliminate information sharing—it constrains how the vendor may use the information that is shared. ZDR meaningfully shrinks that sharing window. Pre-transmission anonymization structurally removes confidential identifiers from the data crossing the trust boundary, which maps directly to the language Opinion 2024-5 highlights ("anonymization of client information"). Combining the two gives the practitioner both the documentary and the structural grounds to demonstrate "reasonable efforts" under Rule 1.6.

This is not legal advice.

Bar ethics frameworks vary by jurisdiction and continue to evolve. The patterns described here generalize well to GDPR data minimization, HIPAA's "minimum necessary" standard, FERPA, and financial-services confidentiality obligations, but specifics—particularly around informed consent and disclosure—should be evaluated by counsel for your jurisdiction and matter type.

Common Procurement Pitfalls

1. Treating chat seats as ZDR products

The four largest enterprise AI chat products—ChatGPT Enterprise, Claude Enterprise, Gemini Enterprise, and Microsoft 365 Copilot—are controlled-retention products by design. Persistent chat history, memory, and tenant-side retention policies are features, not bugs. None of them are ZDR products on the chat surface. If a procurement document asserts the contrary, treat it as an unverified claim.

2. Confusing "no training" with "no retention"

Every major enterprise tier promises that customer data is not used to train foundation models by default. That is a statement about downstream use, not about retention. Data can be retained for abuse monitoring, debugging, customer history features, and operational logs without ever being added to a training set.

3. Overlooking ZDR scope exclusions

ZDR amendments are typically scoped to a specific list of endpoints or product surfaces. Common excluded surfaces:

  • Stateful conversation APIs (threads, conversations, assistants)
  • Vector stores and file storage
  • Grounding integrations (Google Search/Maps grounding, Bing grounding)
  • Implicit and explicit context caching
  • Realtime/voice APIs that maintain session state by default
  • Web Console / Workbench / Playground UIs

4. Forgetting per-organization enablement

For both OpenAI and Anthropic, ZDR is enabled per organization, by sales approval, and does not automatically transfer to new organizations created under the same account. A large enterprise with multiple business units may be operating partially under ZDR and partially without it, depending on which org IDs are in use.
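A lightweight reconciliation check can surface this gap: compare the org IDs your pipelines actually use against the org IDs listed on the signed amendment. A sketch with fabricated org IDs:

```python
# Sketch: reconcile org IDs in active use against the org IDs covered by the
# ZDR amendment. All org IDs below are fabricated examples.

ZDR_ENABLED_ORGS = {"org-legal", "org-research"}  # from the signed amendment

def uncovered_orgs(orgs_in_use: set[str]) -> set[str]:
    """Return org IDs that send traffic but are not covered by ZDR."""
    return orgs_in_use - ZDR_ENABLED_ORGS
```

Fed with the org IDs observed in billing exports or API gateway logs, this turns "we thought the amendment covered everything" into a concrete diff to hand to procurement.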

5. Assuming third-party deployments inherit vendor ZDR

When Anthropic models run on AWS Bedrock or Google Vertex AI, Anthropic's ZDR commitments do not apply—the cloud platform's data-handling terms govern. The same is true for OpenAI models accessed via Azure OpenAI Service: Microsoft's DPA and Modified Abuse Monitoring process apply, not OpenAI's ZDR amendment.

6. Skipping the Workspace/Tenant retention policy

For Gemini in Workspace and Microsoft 365 Copilot, the practical retention footprint is governed largely by your own tenant configuration—Workspace Vault, Microsoft Purview, or equivalent. A misconfigured tenant retention policy is a more common cause of large retention surfaces than the vendor's defaults.

A Practical Procurement Checklist

  1. Identify every surface. Map exactly which products and endpoints your organization uses (chat UI, API, IDE plugin, Bedrock/Vertex/Foundry endpoints, embedding APIs).
  2. Sign the DPA covering each surface, and confirm sub-processor lists, data residency commitments, and breach-notification timelines fit your obligations.
  3. Request a ZDR amendment for every API and surface that is eligible. For OpenAI and Anthropic this means engaging sales and accepting the eligibility review. For Google, configure ZDR per project. For Azure OpenAI, apply for Modified Abuse Monitoring.
  4. Document scope explicitly. List the endpoints and product surfaces covered and excluded. Make sure operations and security teams know which features (file uploads, threads, grounding, caching) fall outside the amendment.
  5. Configure retention policies on your side. For Workspace and Microsoft 365 Copilot, set retention windows and deletion rules in Vault / Purview that match your data-classification policy.
  6. Layer on local anonymization for sensitive workflows. CamoText or equivalents add a pre-transmission scrubbing step that survives any future change to the vendor's retention behavior, scope of ZDR coverage, or sub-processor chain. See our deeper comparison in Best Text Anonymizer 2026.
  7. Train end users. Most retention exposure is operational, not contractual—an employee pasting a client roster into an excluded surface will defeat the strongest amendment. Pair the contracts with a short, clear acceptable-use policy.
  8. Audit annually. ZDR scopes shift as new endpoints ship and product features change. Re-read the relevant retention pages and re-confirm your enabled organizations are still configured correctly. Vendor terms can change frequently and the lawyer's obligation under Opinion 2024-5 to monitor them is continuing.

Frequently Asked Questions

Does signing a DPA mean my prompts aren't stored?

No. The DPA limits how the vendor can use your data and what security commitments apply. It typically does not prohibit retention. For non-retention you generally need a separate Zero Data Retention amendment, scoped to specific endpoints.

Is ChatGPT Enterprise zero data retention?

The ChatGPT Enterprise chat product is not ZDR—chat history, memory, and file uploads necessarily persist. The underlying API can be enrolled in ZDR on eligible endpoints with sales approval.

Is Claude Enterprise zero data retention?

The Claude Enterprise chat interface is not ZDR-eligible. ZDR is available for Claude Code on Claude for Enterprise when enabled per organization, and for the Messages and Token Counting APIs when called with commercial organization API keys.

Is Microsoft 365 Copilot zero data retention?

No. Microsoft markets "Enterprise Data Protection," which provides DPA-grade processor commitments and tenant-managed retention via Microsoft Purview. It is a controlled-retention model. Azure OpenAI Service offers a Modified Abuse Monitoring track that is the closest analog to ZDR for the API.

What's the cheapest way to get a ZDR posture?

Building on the pay-as-you-go developer APIs of OpenAI, Anthropic, or Google with a ZDR amendment is usually the lowest-cost path, because pricing is per-token rather than per-seat. The tradeoff is that you must provide the chat surface and admin tooling yourself.
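The per-seat vs. per-token tradeoff can be sanity-checked with a quick break-even calculation. The prices below are illustrative assumptions, not vendor quotes:

```python
# Back-of-envelope comparison: per-seat chat licensing vs. per-token API
# usage. Both numbers are illustrative assumptions.

SEAT_PRICE_PER_MONTH = 30.00      # e.g. a $30/user/mo enterprise seat
TOKEN_PRICE_PER_MILLION = 5.00    # assumed blended input+output $/1M tokens

def monthly_api_cost(tokens_per_user: int) -> float:
    """Per-user monthly API spend at the assumed blended token price."""
    return tokens_per_user / 1_000_000 * TOKEN_PRICE_PER_MILLION

def breakeven_tokens_per_user() -> int:
    """Tokens per user per month at which API usage costs as much as a seat."""
    return int(SEAT_PRICE_PER_MONTH / TOKEN_PRICE_PER_MILLION * 1_000_000)
```

Under these assumed prices the break-even point is 6M tokens per user per month; most knowledge workers consume far less, which is why the per-token path tends to win on cost for light-to-moderate usage.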

Do I still need anonymization if I have ZDR?

For high-sensitivity workflows—client privilege, PHI, MNPI, classified data—yes. ZDR is a contractual control that addresses storage at rest; anonymization is a structural control that addresses what crosses the trust boundary in the first place. Together they provide defense in depth across both layers.

Further Reading