GPT-4o includes built-in safeguards against generating child sexual abuse material (CSAM), but operators running adult platforms can unlock explicit sexual content for their users, provided it does not involve minors.
If you or your children use an app built on GPT-4o, the platform operator may have enabled explicit adult content. OpenAI's CSAM block is the primary protection for minors; the policy does not mention robust age-verification requirements for operators who unlock adult content.
The architecture allowing operators to unlock adult content, combined with acknowledged limitations in age verification, creates a structural risk that explicit content may reach minors through third-party platforms.
REGULATORY FRAMEWORK: This provision implicates COPPA (15 U.S.C. §6501 et seq.) for deployments accessible to children under 13, the PROTECT Act (18 U.S.C. §2252A) for CSAM, EU AI Act Article 5 (prohibited AI practices causing harm to minors), and California's Age-Appropriate Design Code (AB 2273). The FTC enforces COPPA; the DOJ enforces PROTECT Act violations; the EU AI Office enforces AI Act prohibitions.