The policy absolutely prohibits using Cohere's AI to create any sexual content involving children, including AI-generated imagery or text.
This analysis describes what Cohere's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This provision establishes an unconditional prohibition with no operator-level override permitted, and violation would constitute a breach of the AUP as well as potentially criminal conduct under applicable law in most jurisdictions.
Any application built on Cohere's API that generates CSAM or sexual content involving minors is prohibited, meaning users of such applications cannot invoke Cohere's platform to produce this material, and operators cannot configure or permit this use.
"Do not use Cohere's services to generate child sexual abuse material (CSAM) or any sexual content involving minors." — Excerpt from Cohere's Responsible Use Policy
REGULATORY LANDSCAPE: This provision directly engages US federal law under 18 U.S.C. Chapter 110 (sexual exploitation of minors) and equivalent statutes in most jurisdictions globally. The National Center for Missing and Exploited Children (NCMEC) operates a CyberTipline with mandatory reporting obligations for electronic service providers under US law. The EU's proposed CSAM regulation and existing Directive 2011/93/EU also impose obligations on service providers. The FTC may also have jurisdiction over deceptive or unfair practices related to child safety.

GOVERNANCE EXPOSURE: High. This prohibition is legally mandated in most jurisdictions independent of contractual terms, and failure to implement effective technical and policy controls creates severe criminal and civil liability. The clause is non-negotiable and cannot be modified by operator configuration.

JURISDICTION FLAGS: Universal applicability. No jurisdiction permits CSAM. The US, EU, UK, Canada, and Australia all impose mandatory reporting and criminal liability. Organizations operating globally face multi-jurisdictional obligations that the AUP alone does not satisfy.

CONTRACT AND VENDOR IMPLICATIONS: Procurement teams should treat this as a baseline compliance requirement in any vendor agreement involving AI content generation. B2B contracts should include explicit representations and warranties from API users that their platforms implement content filtering and detection measures consistent with legal obligations.

COMPLIANCE CONSIDERATIONS: Organizations should implement and document technical controls to prevent CSAM generation, establish incident response procedures for detection and mandatory reporting, and conduct periodic audits of content moderation systems. Legal counsel should confirm whether the organization qualifies as an electronic service provider subject to mandatory CyberTipline reporting under 18 U.S.C. § 2258A.
ConductAtlas is an independent monitoring service. It is not affiliated with, endorsed by, or sponsored by Cohere.