Cohere · Cohere Usage Policy

Prohibition on CSAM

High severity · High confidence · Explicit document language · Unique (0 of 325 platforms)
Document Record

What it is

This policy strictly and absolutely prohibits generating child sexual abuse material (CSAM) using Cohere's models or API.

This analysis describes what Cohere's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This is a categorical prohibition with no exceptions; any application or output that generates CSAM would constitute an immediate and severe breach of the Terms of Service and would independently constitute criminal conduct in virtually all jurisdictions.

Consumer impact (what this means for users)

This prohibition directly protects minors from harm by categorically barring any use of Cohere's models to produce child sexual abuse material.

How other platforms handle this

Runway Medium

You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.

Mistral AI Medium

Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...

Perplexity AI Medium

You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content.


Monitoring

Cohere has changed this document before.

Original Clause Language
Certain use cases, such as violence, hate speech, fraud, and privacy violations, are strictly prohibited. [The policy identifies generation of child sexual abuse material as a categorical prohibited use.]

— Excerpt from Cohere's Cohere Usage Policy

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

(1) REGULATORY LANDSCAPE: Generation of CSAM is a federal crime under 18 U.S.C. Chapter 110 in the United States and is criminalized in all EU member states and most jurisdictions globally. The NCMEC CyberTipline is the primary reporting mechanism in the US. This provision also engages COPPA and the EU's Directive on combating the sexual abuse and sexual exploitation of children. Enforcement authority includes the DOJ, FBI, Homeland Security Investigations, and international equivalents.

(2) GOVERNANCE EXPOSURE: High. This is the most unambiguous prohibition in the document. Any organization whose deployment results in CSAM generation faces not only API termination but potential criminal and civil liability independent of this policy.

(3) JURISDICTION FLAGS: This prohibition applies globally without jurisdictional variation; CSAM is criminalized in all major jurisdictions.

(4) CONTRACT AND VENDOR IMPLICATIONS: Procurement teams and platform operators deploying Cohere in contexts where users may attempt to generate such content should implement independent content moderation layers and reporting mechanisms. Reliance solely on Cohere's model-level safeguards is not a sufficient compliance posture for platforms with significant user-generated content.

(5) COMPLIANCE CONSIDERATIONS: Organizations deploying Cohere in consumer-facing or user-generated content contexts should implement CSAM detection tools, establish NCMEC reporting protocols, and include explicit CSAM prohibitions in their own end-user terms of service.
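The independent-moderation point in (4) is commonly implemented as a gate in front of and behind the model call, so that neither the prompt nor the output reaches the user unchecked. A minimal sketch of that pattern follows; `classify`, `Verdict`, and `moderated_generate` are hypothetical names, not a Cohere API, and a real deployment would plug in a dedicated third-party or in-house detection service:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Verdict:
    allowed: bool
    reason: str = ""

# A classifier is any callable mapping text to a Verdict. A real deployment
# would plug in a dedicated detection service here, not a local heuristic.
Classifier = Callable[[str], Verdict]

def moderated_generate(prompt: str,
                       generate: Callable[[str], str],
                       classify: Classifier) -> str:
    """Gate both the prompt and the model output through an independent check."""
    pre = classify(prompt)
    if not pre.allowed:
        raise PermissionError(f"prompt blocked: {pre.reason}")
    output = generate(prompt)
    post = classify(output)
    if not post.allowed:
        # Block rather than return; real deployments would also log and,
        # where legally required, report.
        raise PermissionError(f"output blocked: {post.reason}")
    return output
```

Checking both sides of the call reflects the analysis above: a policy breach can arise from either the request or the generated output, so a prompt-only filter is not a sufficient control.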


Applicable agencies

  • FTC
    The FTC has consumer protection authority over platforms that fail to implement adequate safeguards against illegal content generation.

Applicable regulations

  • CFAA (United States, Federal)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
  • Document: Cohere Usage Policy
  • Entity: Cohere
  • Document last updated: May 5, 2026

Tracking information
  • First tracked: April 30, 2026
  • Last verified: May 12, 2026
  • Record ID: CA-P-011005
  • Document ID: CA-D-00442

Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): 2937f674a79ab03784eab9a8774b7c807068d6f695cd81b3eb7bc9419a338c76
  • Analysis generated: April 30, 2026 06:46 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
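The hash-verified evidence claim can be checked independently: recompute SHA-256 over the archived snapshot bytes and compare against the recorded digest. A minimal sketch, using a stand-in payload since the real input is the archived snapshot file itself:

```python
import hashlib

def verify_snapshot(snapshot_bytes: bytes, expected_hex: str) -> bool:
    """Recompute SHA-256 over snapshot bytes and compare to the recorded digest."""
    digest = hashlib.sha256(snapshot_bytes).hexdigest()
    # hexdigest() returns lowercase hex; normalize the recorded value too
    return digest == expected_hex.lower()

# Stand-in payload; in practice, read the archived snapshot from disk.
payload = b"example snapshot contents"
recorded = hashlib.sha256(payload).hexdigest()
print(verify_snapshot(payload, recorded))  # True
```

Any single-byte difference between the archived snapshot and the document as captured produces a different digest, which is what makes the recorded hash a stable integrity reference.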
Citation Record
Entity: Cohere
Document: Cohere Usage Policy
Record ID: CA-P-011005
Captured: 2026-04-30 06:46:20 UTC
SHA-256: 2937f674a79ab037…
URL: https://conductatlas.com/platform/cohere/cohere-usage-policy/prohibition-on-csam/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity: High
Categories



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Cohere's Prohibition on CSAM clause do?

This is a categorical prohibition with no exceptions; any application or output that generates CSAM would constitute an immediate and severe breach of the Terms of Service and would independently constitute criminal conduct in virtually all jurisdictions.

How does this clause affect you?

This prohibition directly protects minors from harm by categorically barring any use of Cohere's models to produce child sexual abuse material.

Is ConductAtlas affiliated with Cohere?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cohere.