Anthropic · Anthropic API Usage Policy

Children's Safety and CSAM Reporting Obligation

High severity · High confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Anthropic absolutely prohibits any content sexualizing or harming minors, and has committed to reporting detected CSAM to law enforcement authorities. This applies to AI-generated content and fictional settings, not just real-world material.

This analysis describes what Anthropic's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

The explicit commitment to report to authorities is one of the strongest enforcement commitments in the policy, and the definition of a minor as anyone under 18, regardless of local jurisdiction, creates a globally uniform standard that may be stricter than some local laws.

Recent Activity

This document changed recently

High · Feb 27, 2026

Defense contractors and federal agencies using Claude must find alternatives. Enterprise customers with defense-adjacent business face compliance risk.

Consumer impact (what this means for users)

Any attempt to generate CSAM or related content through Anthropic's products, including fictional or AI-generated material, will result in reporting to law enforcement. The under-18 definition applies globally regardless of what local law says about age of consent.

How other platforms handle this

T-Mobile · Medium

Our services are not directed to children under the age of 13. We do not knowingly collect personal information from children under the age of 13 without parental consent. If we become aware that we have collected personal information from a child under the age of 13 without parental consent, we wil...

McDonald's · Medium

Our online services are not directed to children under the age of 13, and we do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13, we will delete that information as quickly as possible.

Figma · Medium

Our Services are not directed to children under the age of 13. We do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13 without parental consent, we will take steps to delete such information. In some juris...


Monitoring

Anthropic has changed this document before.

Original Clause Language
Create, distribute, or promote child sexual abuse material ("CSAM"), including AI-generated CSAM [...] Note: We define a minor or child to be any individual under the age of 18 years old, regardless of jurisdiction. When we detect CSAM (including AI-generated CSAM), or coercion or enticement of a minor to engage in sexual activities, we will report to relevant authorities.

— Excerpt from Anthropic's Anthropic API Usage Policy


Institutional analysis (Compliance & governance intelligence)

(1) REGULATORY LANDSCAPE: This provision directly implicates 18 U.S.C. Section 2258A (NCMEC CyberTipline reporting obligations), COPPA for platforms serving minors, and the PROTECT Act. Internationally, it engages the EU's Digital Services Act obligations for illegal content reporting and GDPR where minor data is processed. Relevant enforcement authorities include the Department of Justice, NCMEC, and international equivalents.

(2) GOVERNANCE EXPOSURE: High. The commitment to report detected CSAM to relevant authorities creates a mandatory reporting obligation that, if not operationalized with sufficient detection infrastructure, could expose Anthropic to regulatory risk. For operators deploying Claude in consumer-facing products that may reach minors, the downstream obligation to prevent such use creates compliance exposure at the operator level as well.

(3) JURISDICTION FLAGS: The universal under-18 definition regardless of jurisdiction is operationally significant for the EU, the UK, and jurisdictions where the age of consent differs. Operators in the EU must evaluate obligations under the Digital Services Act and GDPR Article 8 for processing children's data. The globally uniform standard may create compliance complexity for operators in jurisdictions with different legal age thresholds.

(4) CONTRACT AND VENDOR IMPLICATIONS: Operators serving consumer audiences must implement age verification or access controls sufficient to satisfy the policy's minor protection requirements. The reporting commitment means operators should assess their own incident response and law enforcement cooperation procedures, as a violation detected by Anthropic in a third-party deployment could trigger reporting that affects the operator's own legal exposure.

(5) COMPLIANCE CONSIDERATIONS: Operators should audit whether their own terms of service and content moderation policies align with the under-18 global definition and CSAM prohibition. Age-gating mechanisms, parental consent flows, and content filtering should be reviewed. Any product serving minors should evaluate compliance with COPPA, the UK Age Appropriate Design Code, and equivalent frameworks.


Applicable agencies

  • FTC
    The FTC has authority over COPPA compliance and child safety practices in consumer-facing digital services

Applicable regulations

EU AI Act · European Union
BIPA · Illinois, USA
CCPA/CPRA · California, USA
Colorado AI Act · Colorado, USA
Connecticut Data Privacy Act Amendments · Connecticut, USA
CAN-SPAM · United States Federal
EU AI Act - High Risk Provisions · European Union
FTC Act Section 5 · United States Federal
GDPR · European Union
Indiana Consumer Data Protection Act · Indiana, USA
Kentucky Consumer Data Protection Act · Kentucky, USA
UK GDPR · United Kingdom
Universal Opt-Out Mechanism Expansion 2026 · United States

Provision details

Document information
Document
Anthropic API Usage Policy
Entity
Anthropic
Document last updated
May 11, 2026
Tracking information
First tracked
May 11, 2026
Last verified
May 11, 2026
Record ID
CA-P-009963
Document ID
CA-D-00013
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
60e693438d9f7f47deb8f3bfb819343e26b5fe0eb90d56280568f1dd95ae660f
Analysis generated
May 11, 2026 00:39 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
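The record publishes a SHA-256 content hash so the archived snapshot can be independently verified. As a minimal sketch (assuming you have obtained the archived snapshot bytes; the function name is illustrative, not part of any ConductAtlas tooling), verification is a straight digest comparison:

```python
import hashlib

def verify_snapshot(snapshot_bytes: bytes, expected_hex: str) -> bool:
    """Return True if the snapshot's SHA-256 digest matches the published hash."""
    return hashlib.sha256(snapshot_bytes).hexdigest() == expected_hex.lower()

# Illustrative only: real verification would read the archived snapshot file
# and compare it against the hash published in the evidence record.
digest = hashlib.sha256(b"example snapshot").hexdigest()
print(verify_snapshot(b"example snapshot", digest))  # True
```

A mismatch means the bytes you hold are not the bytes that were captured, so the record's claims cannot be tied to your copy.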
Citation Record
Entity: Anthropic
Document: Anthropic API Usage Policy
Record ID: CA-P-009963
Captured: 2026-05-11 00:39:26 UTC
SHA-256: 60e693438d9f7f47…
URL: https://conductatlas.com/platform/anthropic/anthropic-api-usage-policy/childrens-safety-and-csam-reporting-obligation/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity
High
Categories


Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Anthropic's Children's Safety and CSAM Reporting Obligation clause do?

The clause absolutely prohibits any content sexualizing or harming minors, including AI-generated and fictional material, and commits Anthropic to reporting detected CSAM to law enforcement. A minor is defined as anyone under 18, regardless of local jurisdiction, a globally uniform standard that may be stricter than some local laws.

How does this clause affect you?

Any attempt to generate CSAM or related content through Anthropic's products, including fictional or AI-generated material, will result in reporting to law enforcement. The under-18 definition applies globally regardless of what local law says about age of consent.

Is ConductAtlas affiliated with Anthropic?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Anthropic.