Anthropic will report any detected child sexual abuse material — including AI-generated images or text — to law enforcement, and defines anyone under 18 as a minor regardless of local age-of-consent laws.
Any attempt to generate CSAM or sexually exploit minors through Claude will be reported to authorities, and the universal age-18 definition means no jurisdictional loophole applies.
This provision means Anthropic actively monitors for CSAM and will involve law enforcement, creating real legal consequences for users who attempt to generate such content.
REGULATORY FRAMEWORK: This provision directly implicates 18 U.S.C. § 2258A (mandatory CyberTipline reporting to NCMEC for electronic service providers), the PROTECT Act of 2003, and the EARN IT Act framework. Internationally, it engages the UK Online Safety Act 2023 (illegal content duties) and EU CSA Regulation proposals. Primary enforcement authorities are the DOJ/FBI, NCMEC, and equivalent national bodies.