Cohere · Cohere Responsible Use Policy

Prohibited Use: CSAM and Sexual Content Involving Minors

High severity · High confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

The policy absolutely prohibits using Cohere's AI to create any sexual content involving children, including AI-generated imagery or text.

This analysis describes what Cohere's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This provision establishes an unconditional prohibition with no operator-level override permitted, and violation would constitute a breach of the AUP as well as potentially criminal conduct under applicable law in most jurisdictions.

Consumer impact (what this means for users)

Any application built on Cohere's API that generates CSAM or sexual content involving minors is prohibited, meaning users of such applications cannot invoke Cohere's platform to produce this material, and operators cannot configure or permit this use.


Monitoring

Cohere has changed this document before.

Original Clause Language

"Do not use Cohere's services to generate child sexual abuse material (CSAM) or any sexual content involving minors."

— Excerpt from Cohere's Responsible Use Policy


Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision directly engages US federal law under 18 U.S.C. Chapter 110 (sexual exploitation of minors) and equivalent statutes in most jurisdictions globally. The National Center for Missing and Exploited Children (NCMEC) operates a CyberTipline with mandatory reporting obligations for electronic service providers under US law. The EU's proposed CSAM regulation and existing Directive 2011/93/EU also impose obligations on service providers. The FTC may also have jurisdiction over deceptive or unfair practices related to child safety.

GOVERNANCE EXPOSURE: High. This prohibition is legally mandated in most jurisdictions independent of contractual terms, and failure to implement effective technical and policy controls creates severe criminal and civil liability. The clause is non-negotiable and cannot be modified by operator configuration.

JURISDICTION FLAGS: Universal applicability. No jurisdiction permits CSAM. The US, EU, UK, Canada, and Australia all impose mandatory reporting and criminal liability. Organizations operating globally face multi-jurisdictional obligations that the AUP alone does not satisfy.

CONTRACT AND VENDOR IMPLICATIONS: Procurement teams should treat this as a baseline compliance requirement in any vendor agreement involving AI content generation. B2B contracts should include explicit representations and warranties from API users that their platforms implement content filtering and detection measures consistent with legal obligations.

COMPLIANCE CONSIDERATIONS: Organizations should implement and document technical controls to prevent CSAM generation, establish incident response procedures for detection and mandatory reporting, and conduct periodic audits of content moderation systems. Legal counsel should confirm whether the organization qualifies as an electronic service provider subject to mandatory CyberTipline reporting under 18 U.S.C. § 2258A.
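The technical controls described above can be sketched as an operator-side gate that screens requests before they ever reach a generation API. Everything below is an illustrative placeholder, not a Cohere API or a production detection method: real deployments rely on trained safety classifiers and industry hash-matching services, and the keyword check here only stands in for that signal to show the non-bypassable control flow.

```python
# Minimal sketch of an operator-side safety gate. The category check is a
# stand-in for a real ML classifier / hash-matching service; names are
# hypothetical and not part of any Cohere SDK.

from dataclasses import dataclass


@dataclass
class ModerationResult:
    allowed: bool
    reason: str


def log_and_report(prompt: str, category: str) -> None:
    # Stub for incident logging and, where legally required, CyberTipline
    # reporting via counsel-approved procedures (18 U.S.C. § 2258A).
    pass


def moderate_prompt(prompt: str) -> ModerationResult:
    """Screen a prompt before forwarding it to a generation API.

    The blocked-category list is fixed in code rather than configuration,
    mirroring the clause's "no operator-level override" requirement.
    """
    blocked_categories = ["csam"]  # non-negotiable; never operator-configurable
    for category in blocked_categories:
        if category in prompt.lower():
            log_and_report(prompt, category)  # mandatory-reporting hook
            return ModerationResult(False, f"blocked: {category}")
    return ModerationResult(True, "ok")
```

The design point is that the prohibition lives in the request path itself, upstream of any model call, so no deployment configuration can route around it.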


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive practices, including failures by platforms to implement child safety measures.

Provision details

Document information
Document: Cohere Responsible Use Policy
Entity: Cohere
Document last updated: May 12, 2026

Tracking information
First tracked: May 12, 2026
Last verified: May 12, 2026
Record ID: CA-P-011990
Document ID: CA-D-00830

Evidence Provenance
Source URL: Wayback Machine
Content hash (SHA-256): 525a1023544d802d0b69aead1ed2f42d817072b058c572837c434d0b14e12fa2
Analysis generated: May 12, 2026 16:53 UTC
Evidence: ✓ Snapshot stored · ✓ Hash verified
Citation Record
Entity: Cohere
Document: Cohere Responsible Use Policy
Record ID: CA-P-011990
Captured: 2026-05-12 16:53:50 UTC
SHA-256: 525a1023544d802d…
URL: https://conductatlas.com/platform/cohere/cohere-responsible-use-policy/prohibited-use-csam-and-sexual-content-involving-minors/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity: High



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Cohere's Prohibited Use: CSAM and Sexual Content Involving Minors clause do?

This provision establishes an unconditional prohibition with no operator-level override permitted, and violation would constitute a breach of the AUP as well as potentially criminal conduct under applicable law in most jurisdictions.

How does this clause affect you?

Any application built on Cohere's API that generates CSAM or sexual content involving minors is prohibited, meaning users of such applications cannot invoke Cohere's platform to produce this material, and operators cannot configure or permit this use.

Is ConductAtlas affiliated with Cohere?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cohere.