Teenagers on Character.AI use a more restricted version of the AI than adult users do: it filters out sensitive content and limits which characters they can interact with.
This analysis describes what Character.AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology for details.
This provision discloses a material architectural difference in how minors experience the platform, which has direct implications for child safety compliance under COPPA and analogous state laws.
Minors accessing Character.AI receive a filtered AI experience with access to fewer characters and more conservative content controls, which limits certain platform capabilities but is presented as a protective measure for users under 18.
How other platforms handle this
We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization).
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
Only models with a post-mitigation score of "medium" or below can be deployed. Only models with a post-mitigation score of "high" or below can be developed further.
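The gating rule quoted above can be sketched in code. This is an illustrative reading of the stated thresholds, not the framework's actual implementation; the risk tiers and function names are assumptions.

```python
# Hedged sketch of the quoted gating rule: deploy only if the
# post-mitigation risk score is "medium" or below; continue development
# only if it is "high" or below. Tier names and ordering are assumed
# for illustration.

RISK_ORDER = ["low", "medium", "high", "critical"]


def can_deploy(post_mitigation_score: str) -> bool:
    """Deployment is permitted at "medium" risk or below."""
    return RISK_ORDER.index(post_mitigation_score) <= RISK_ORDER.index("medium")


def can_develop_further(post_mitigation_score: str) -> bool:
    """Further development is permitted at "high" risk or below."""
    return RISK_ORDER.index(post_mitigation_score) <= RISK_ORDER.index("high")


print(can_deploy("medium"))            # True
print(can_deploy("high"))              # False
print(can_develop_further("high"))     # True
print(can_develop_further("critical")) # False
```

The asymmetry between the two checks is the point of the provision: a model can be too risky to ship while still being permissible to keep working on.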
Monitoring
Character.AI has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"Users under 18 years old interact with an age-appropriate model specifically designed to reduce the likelihood of exposure to sensitive or suggestive content. Our under-18 model has additional and more conservative classifiers than the model for our adult users so we can enforce our content policies and filter out sensitive content from our model's responses. While hundreds of millions of user-created Characters exist on the platform, teen users are only able to access a narrower set of Characters." — Excerpt from Character.AI's Community Guidelines
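The architecture the excerpt describes, routing verified minors to a model configuration with additional classifiers and a narrower character catalog, could be sketched as follows. All identifiers here are hypothetical illustrations, not Character.AI's actual API or model names.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of age-based model routing as described in the
# excerpt: under-18 users receive a configuration with extra, more
# conservative content classifiers and a curated character catalog.
# Model IDs, classifier names, and catalog labels are assumptions.


@dataclass
class ModelConfig:
    model_id: str
    classifiers: list[str] = field(default_factory=list)
    character_catalog: str = "full"


ADULT = ModelConfig(
    model_id="chat-default",
    classifiers=["baseline-safety"],
    character_catalog="full",
)

MINOR = ModelConfig(
    model_id="chat-under18",
    classifiers=["baseline-safety", "sensitive-content", "suggestive-content"],
    character_catalog="curated",
)


def route_model(verified_age: int) -> ModelConfig:
    """Select a model configuration based on the user's verified age."""
    return MINOR if verified_age < 18 else ADULT


print(route_model(15).character_catalog)  # curated
print(route_model(25).model_id)           # chat-default
```

Note that a routing layer like this is only as reliable as the age signal feeding it, which is why the compliance considerations below focus on age verification rather than on the filtering itself.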
REGULATORY LANDSCAPE: This provision directly engages COPPA, which governs online services directed to or with actual knowledge of users under 13, and emerging state child online safety laws including California's Age-Appropriate Design Code. The FTC is the primary federal enforcement authority for COPPA. The EU's GDPR includes specific protections for minors, and the UK's Age Appropriate Design Code (Children's Code) imposes design obligations for services likely to be accessed by under-18 users.

GOVERNANCE EXPOSURE: High. The disclosure of a minor-specific AI model and character access restrictions signals that Character.AI has actual knowledge of minor users on the platform, which triggers COPPA obligations regardless of whether the platform is primarily directed to children. The adequacy of the technical controls described here will be a key factor in any regulatory review of COPPA compliance.

JURISDICTION FLAGS: California's Age-Appropriate Design Code, Texas, and several other states have enacted or are considering child online safety legislation that may impose design, data minimization, and parental consent requirements beyond federal COPPA standards. EU and UK users under 18 are subject to heightened GDPR and UK Children's Code obligations. Compliance exposure is elevated in all jurisdictions where the platform has a meaningful minor user base.

CONTRACT AND VENDOR IMPLICATIONS: Organizations deploying Character.AI in educational or youth-facing contexts should verify whether the under-18 model and access controls satisfy institutional obligations under FERPA, state student privacy laws, and applicable content filtering standards. Vendor assessments should confirm the technical adequacy and update cadence of the minor-specific classifiers.
COMPLIANCE CONSIDERATIONS: Compliance teams should evaluate age verification mechanisms to ensure that minor users are reliably identified and served the restricted model, and assess whether the Parental Insights tool satisfies COPPA's verifiable parental consent requirements. Data mapping should account for the separately processed data associated with minor user interactions.
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Character.AI.