Cohere · Cohere Enterprise Data Commitments

No Model Training on Customer Data Without Opt-In

High severity · Medium confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Cohere states that it will not use the data you send through its platform (such as prompts and documents) to train its AI models unless you specifically agree to allow it.

This analysis describes what Cohere's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This provision directly addresses a common concern for enterprise customers deploying AI: whether proprietary business data submitted as prompts or documents could be incorporated into shared model training. The document states that training on customer data requires explicit opt-in rather than being permitted by default on an opt-out basis.

Interpretive note: The document is a commitments page rather than a binding contract; enforceability depends on whether these commitments are incorporated into the executed master service agreement.

Consumer impact (what this means for users)

Enterprise customers can submit prompts, completions, and documents to Cohere's platform with the assurance, per this document, that such data will not be used to train Cohere's models unless the customer has expressly opted in. The practical protection depends on whether this commitment is enforceable under the executed service agreement.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Verify Your Data Processing Terms
    Contact Cohere's enterprise team to confirm your data processing terms and verify that the no-training commitment is reflected in your executed agreement. Request written confirmation of opt-in status for model training.

How other platforms handle this

Writer Medium

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.

Ideogram Medium

We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.

Windsurf Medium

We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...

See all platforms with this clause type →

Monitoring

Cohere has changed this document before.

Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.

Original Clause Language

"Cohere does not use customer data to train its models. Enterprise customers must explicitly opt in for their data to be used for any model training purposes."

— Excerpt from Cohere's Enterprise Data Commitments

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

(1) REGULATORY LANDSCAPE: This provision engages GDPR Article 28 processor obligations, which require that a data processor act only on documented instructions from the controller. If enterprise customer data includes personal data, using it for model training without a documented legal basis and controller instruction would implicate GDPR obligations. The FTC Act's prohibition on unfair or deceptive practices is also relevant if commitments made on this page are not operationally implemented. Relevant enforcement authorities include EU data protection authorities, the UK ICO, and the FTC.

(2) GOVERNANCE EXPOSURE: High. The commitment not to use customer data for model training without opt-in is a foundational assurance in enterprise AI procurement. If this commitment is not reflected in the executed master service agreement or data processing addendum, its enforceability is uncertain. Organizations with strict data governance policies should confirm this commitment is contractually binding.

(3) JURISDICTION FLAGS: EU and EEA customers face heightened exposure because GDPR Article 28 requires that processing restrictions be documented in a binding data processing agreement, not merely disclosed on a commitments page. California enterprise customers should assess whether CCPA processor obligations align with this commitment. Regulated sectors (healthcare, financial services) should evaluate against sector-specific data handling requirements.

(4) CONTRACT AND VENDOR IMPLICATIONS: Procurement teams should confirm that the no-training commitment is expressly included in the master service agreement or a separately executed data processing addendum. A commitments page that is not incorporated by reference into the contract may not constitute a binding obligation. Vendor risk assessments should document this confirmation.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should audit whether the opt-in mechanism for model training is clearly defined, documented, and auditable. Data mapping exercises should classify customer prompt and completion data and verify that processing is restricted to the stated permitted purposes. Any changes to this commitment should trigger contract amendment review and potentially customer notification obligations.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive trade practices; representations made in a public commitments document about data use practices are relevant to this authority.
    File a complaint →

Applicable regulations

EU AI Act
European Union
California AB 2013 AI Training Data Transparency
US-CA
Colorado AI Act
US-CO
EU AI Act - High Risk Provisions
EU
GDPR
European Union
Texas AI Act
Texas, USA
Trump Executive Order on AI Policy Framework
US

Provision details

Document information
Document
Cohere Enterprise Data Commitments
Entity
Cohere
Document last updated
May 11, 2026
Tracking information
First tracked
May 11, 2026
Last verified
May 12, 2026
Record ID
CA-P-011328
Document ID
CA-D-00767
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
8628f8f463d454e1098b82322c6192f389628876bf5850be1d3d46adf29654e7
Analysis generated
May 11, 2026 12:20 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
Citation Record
Entity: Cohere
Document: Cohere Enterprise Data Commitments
Record ID: CA-P-011328
Captured: 2026-05-11 12:20:05 UTC
SHA-256: 8628f8f463d454e1…
URL: https://conductatlas.com/platform/cohere/cohere-enterprise-data-commitments/no-model-training-on-customer-data-without-opt-in/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
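The evidence model above pairs an archived snapshot with a recorded SHA-256 content hash, so anyone holding a copy of the snapshot can independently confirm it matches the captured version. A minimal sketch of that check in Python follows; the file name `snapshot.html` is hypothetical, and the recorded hash is the one shown in the provenance record above.

```python
import hashlib

# Hash recorded in the Evidence Provenance section above.
RECORDED_HASH = "8628f8f463d454e1098b82322c6192f389628876bf5850be1d3d46adf29654e7"

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, streaming in chunks
    so large snapshots are not loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# "snapshot.html" is a hypothetical local copy of the archived document.
# A match means the local copy is byte-identical to the captured version.
# print(sha256_of("snapshot.html") == RECORDED_HASH)
```

Note that the comparison only succeeds on a byte-identical copy: any re-rendering or re-encoding of the page will produce a different digest, which is why the archival snapshot itself is the artifact of record.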
Classification
Severity
High
Categories



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Cohere's No Model Training on Customer Data Without Opt-In clause do?

This provision directly addresses a common concern for enterprise customers deploying AI: whether proprietary business data submitted as prompts or documents could be incorporated into shared model training. The document states that training on customer data requires explicit opt-in rather than being permitted by default on an opt-out basis.

How does this clause affect you?

Enterprise customers can submit prompts, completions, and documents to Cohere's platform with the assurance, per this document, that such data will not be used to train Cohere's models unless the customer has expressly opted in. The practical protection depends on whether this commitment is enforceable under the executed service agreement.

Is ConductAtlas affiliated with Cohere?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cohere.