Using Cohere's AI to generate non-consensual intimate imagery (deepfake sexual content) is strictly forbidden under this policy.
This provision directly protects individuals from having Cohere's AI used to create sexual imagery of them without their consent, offering a specific safeguard for potential victims of AI-generated NCII.
AI-generated NCII is a rapidly growing harm affecting real individuals; Cohere's explicit prohibition signals awareness of emerging state and federal NCII laws and reduces the risk that its platform is used to victimize individuals.
Regulatory framework: NCII generation implicates the SHIELD Act (proposed federal legislation) and enacted state statutes including California AB 602 and AB 730, Texas HB 4337, and Virginia Code §18.2-386.2, with enforcement by state attorneys general and local prosecutors. The FTC's unfairness authority under Section 5 also extends to platforms that enable NCII generation against consumers.