OpenAI strictly prohibits any use of its tools to generate sexual content involving minors or content that could be used to groom, exploit, or harm children in any way.
Any attempt to generate CSAM or minor-related exploitation content using OpenAI tools will result in immediate account termination and mandatory referral to federal law enforcement — there is no appeals pathway for this category of violation.
This is a legally required prohibition backed by federal criminal law: violations are not merely a policy matter but a federal crime, and OpenAI is legally obligated to report known CSAM to the National Center for Missing and Exploited Children (NCMEC).
REGULATORY FRAMEWORK: This provision is directly mandated by 18 U.S.C. § 2258A (mandatory NCMEC reporting for providers of electronic communication and remote computing services with actual knowledge of CSAM), 18 U.S.C. § 2256 (the federal definition of child pornography), and COPPA (15 U.S.C. § 6501 et seq.) where minors' data is implicated. The DOJ Child Exploitation and Obscenity Section (CEOS) and the FBI's Crimes Against Children unit are the primary enforcement authorities. In the EU, Articles 34 and 35 of the Digital Services Act (DSA) require very large online platforms to assess and mitigate systemic risks, including the dissemination of CSAM.