Anthropic absolutely prohibits any content sexualizing or harming minors, and has committed to reporting detected CSAM to law enforcement authorities. This applies to AI-generated content and fictional settings, not just real-world material.
This analysis describes what Anthropic's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
The explicit reporting commitment to authorities is one of the strongest enforcement commitments in the policy, and the definition of minor as under-18 regardless of local jurisdiction creates a globally uniform standard that may be stricter than some local laws.
Any attempt to generate CSAM or related content through Anthropic's products, including fictional or AI-generated material, will result in reporting to law enforcement. The under-18 definition applies globally regardless of what local law says about age of consent.
How other platforms handle this
Our services are not directed to children under the age of 13. We do not knowingly collect personal information from children under the age of 13 without parental consent. If we become aware that we have collected personal information from a child under the age of 13 without parental consent, we wil...
Our online services are not directed to children under the age of 13, and we do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13, we will delete that information as quickly as possible.
Our Services are not directed to children under the age of 13. We do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13 without parental consent, we will take steps to delete such information. In some juris...
Monitoring
Anthropic has changed this document before.
"Create, distribute, or promote child sexual abuse material ("CSAM"), including AI-generated CSAM [...] Note: We define a minor or child to be any individual under the age of 18 years old, regardless of jurisdiction. When we detect CSAM (including AI-generated CSAM), or coercion or enticement of a minor to engage in sexual activities, we will report to relevant authorities."
— Excerpt from the Anthropic API Usage Policy
(1) Regulatory landscape: This provision directly implicates 18 U.S.C. Section 2258A (NCMEC CyberTipline reporting obligations), COPPA for platforms serving minors, and the PROTECT Act. Internationally, it engages the EU Digital Services Act's obligations for illegal-content reporting and the GDPR where minors' data is processed. Relevant enforcement authorities include the Department of Justice, NCMEC, and international equivalents.

(2) Governance exposure: High. The commitment to report detected CSAM to relevant authorities creates a mandatory reporting obligation that, if not operationalized with sufficient detection infrastructure, could expose Anthropic to regulatory risk. For operators deploying Claude in consumer-facing products that may reach minors, the downstream obligation to prevent such use creates compliance exposure at the operator level as well.

(3) Jurisdiction flags: The universal under-18 definition, regardless of jurisdiction, is operationally significant for the EU, the UK, and jurisdictions where the age of consent differs. Operators in the EU must evaluate obligations under the Digital Services Act and GDPR Article 8 for processing children's data. The globally uniform standard may create compliance complexity for operators in jurisdictions with different legal age thresholds.

(4) Contract and vendor implications: Operators serving consumer audiences must implement age verification or access controls sufficient to satisfy the policy's minor-protection requirements. The reporting commitment means operators should assess their own incident-response and law-enforcement cooperation procedures, because a violation detected by Anthropic in a third-party deployment could trigger reporting that affects the operator's own legal exposure.

(5) Compliance considerations: Operators should audit whether their own terms of service and content-moderation policies align with the under-18 global definition and the CSAM prohibition. Age-gating mechanisms, parental-consent flows, and content filtering should be reviewed. Any product serving minors should evaluate compliance with COPPA, the UK Age Appropriate Design Code, and equivalent frameworks.
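The two operational rules above can be made concrete in code. The sketch below is purely illustrative: none of these names (`is_minor`, `escalate`, `IncidentReport`) come from any Anthropic API or the policy itself, and the NCMEC CyberTipline destination is an assumption for a US-based operator. It shows the policy's key mechanics: "minor" means under 18 everywhere, irrespective of local age-of-consent law, and detection of prohibited material escalates to a report rather than a silent block.

```python
from dataclasses import dataclass
from datetime import date

# Policy-wide threshold: the under-18 definition applies in every
# jurisdiction, so no local age-of-consent lookup appears anywhere.
GLOBAL_MINOR_AGE = 18

def is_minor(birthdate: date, today: date) -> bool:
    """True if the user is under 18 on `today`, in any jurisdiction."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < GLOBAL_MINOR_AGE

@dataclass
class IncidentReport:
    """Record a hypothetical operator might keep when escalating."""
    category: str     # e.g. "csam", including AI-generated material
    reported_to: str  # e.g. "NCMEC CyberTipline" for a US operator

def escalate(category: str) -> IncidentReport:
    # Under the policy, detection leads to reporting to authorities;
    # fictional or AI-generated material is not exempt.
    return IncidentReport(category=category, reported_to="NCMEC CyberTipline")
```

An age gate built this way (rather than on a per-country table) matches the policy's globally uniform standard, which may be stricter than the local law an operator would otherwise apply.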
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Anthropic.