Character.AI · Character.ai Community Guidelines · View original document ↗

Prohibited Content: Sexual Content and Child Safety

High severity · High confidence · Explicit document language · Unique (0 of 325 platforms)
Document Record

What it is

Character.AI prohibits all sexually explicit content, child sexual abuse material, grooming behavior, sexual extortion, pornography, and nudity on the platform.

This analysis describes what Character.AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

The explicit prohibition on child exploitation material and grooming reflects mandatory legal obligations under federal law and directly implicates the platform's CSAM reporting duties to the National Center for Missing and Exploited Children.

Consumer impact (what this means for users)

Any user who creates or shares content in these prohibited categories faces account enforcement and potential law enforcement referral, and the platform is legally required to report CSAM to federal authorities regardless of its community guidelines.

How other platforms handle this

Runway (Medium severity)

You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.

Perplexity AI (Medium severity)

You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content.

Midjourney (Medium severity)

Do not generate images for political campaigns or to try to influence the outcome of an election. Do not generate images to spread misinformation or disinformation. Do not generate images to attempt to or to actually deceive or defraud anyone. Do not intentionally mislead recipients of generated ima...

Monitoring

Character.AI has changed this document before.

Original Clause Language

"Respect Sexual Content Standards: Keep things appropriate. Illegal sexual content, child exploitation or abuse imagery, grooming, sexual extortion, pornographic content, and nudity are prohibited."

— Excerpt from Character.AI's Character.ai Community Guidelines

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: Federal law (18 U.S.C. Section 2258A) requires electronic service providers to report apparent violations involving child sexual abuse material to NCMEC, and this is a mandatory legal obligation independent of the platform's community guidelines. COPPA and the PROTECT Act are also relevant. The DOJ and FBI are primary federal enforcement authorities for CSAM-related violations. Grooming and sexual extortion prohibitions may also engage state criminal statutes.

GOVERNANCE EXPOSURE: High. CSAM-related obligations are non-discretionary federal legal requirements. Failure to detect and report CSAM to NCMEC carries serious federal criminal exposure. The platform's use of automated moderation and human review for this category is operationally significant, and its adequacy will be a key factor in any regulatory review.

JURISDICTION FLAGS: These prohibitions apply globally, but enforcement obligations and the legal definition of prohibited content vary by jurisdiction. EU jurisdictions have additional child protection obligations under the proposed EU Child Sexual Abuse Regulation. UK law imposes obligations under the Online Safety Act.

CONTRACT AND VENDOR IMPLICATIONS: Organizations deploying Character.AI in educational or youth-serving contexts should verify that CSAM detection and reporting infrastructure is operational and documented. Vendor assessments should include review of NCMEC reporting practices and automated detection system accuracy.

COMPLIANCE CONSIDERATIONS: Compliance teams should confirm that NCMEC CyberTipline reporting workflows are in place and tested, and that human review escalation procedures for CSAM detection are documented. Legal teams should assess whether AI-generated content depicting minors in prohibited contexts triggers the same reporting obligations as user-uploaded CSAM under current federal guidance.

Applicable agencies

  • FTC
    The FTC has jurisdiction over child protection online practices and COPPA compliance relevant to minors on the platform

Applicable regulations

CFAA
United States Federal
DMCA
United States Federal

Provision details

Document information
Document
Character.ai Community Guidelines
Entity
Character.AI
Document last updated
May 11, 2026
Tracking information
First tracked
May 11, 2026
Last verified
May 11, 2026
Record ID
CA-P-010614
Document ID
CA-D-00780
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
ec0a9230a377aef5831a06c6ed9e3bbc7b54344595a80c04401a4ca4fe5a8d48
Analysis generated
May 11, 2026 12:24 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
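The published SHA-256 content hash allows anyone holding the archived snapshot to re-verify its integrity independently. A minimal sketch in Python (the `verify_snapshot` helper and the sample bytes are illustrative assumptions, not part of ConductAtlas's actual tooling):

```python
import hashlib

def verify_snapshot(snapshot_bytes: bytes, expected_hex: str) -> bool:
    """Recompute the SHA-256 digest of a stored snapshot and compare it
    to the published content hash. Neither value is secret, so a plain
    string comparison is sufficient here."""
    return hashlib.sha256(snapshot_bytes).hexdigest() == expected_hex.lower()

# Hypothetical example: any re-downloaded snapshot body can be checked
# against the digest recorded in the evidence provenance section.
sample = b"example snapshot body"
digest = hashlib.sha256(sample).hexdigest()
print(verify_snapshot(sample, digest))          # True: bytes match the record
print(verify_snapshot(sample + b"x", digest))   # False: any change breaks the hash
```

A match confirms the snapshot is byte-for-byte identical to what was hashed at capture time; any edit to the archived document, however small, produces a different digest.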
Citation Record
Entity: Character.AI
Document: Character.ai Community Guidelines
Record ID: CA-P-010614
Captured: 2026-05-11 12:24:11 UTC
SHA-256: ec0a9230a377aef5…
URL: https://conductatlas.com/platform/characterai/characterai-community-guidelines/prohibited-content-sexual-content-and-child-safety/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity
High

Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

Is ConductAtlas affiliated with Character.AI?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Character.AI.