Character.AI · Character.ai Privacy Policy

AI Model Training Use of Chat and Voice Data

High severity · High confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Character.AI uses your chat conversations, voice recordings, and other interaction data to train its AI systems and develop new AI features.

This analysis describes what Character.AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

Users engaging in potentially personal or sensitive conversations with AI characters may not fully appreciate that their messages and voice inputs can become training material for commercial AI models.

Consumer impact (what this means for users)

Your private chat messages and voice recordings may be used to train Character.AI's AI models, meaning the content of your conversations has a use beyond your immediate interaction with the platform.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    Visit the Character.AI support portal, submit a data deletion request specifying that you want your chat and voice data removed from training datasets, and follow any identity verification steps required.

How other platforms handle this

Windsurf Medium

We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...

Writer Medium

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.

Ideogram Medium

We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.


Monitoring

Character.AI has changed this document before.

Original clause language:

"Analyze, maintain, improve, modify, customize, and measure the Services, including to train our artificial intelligence/machine learning models; Develop new features, algorithms and machine learning models, programs, and services."

— Excerpt from Character.AI's Character.ai Privacy Policy


Institutional analysis (Compliance & governance intelligence)

Regulatory landscape: This provision implicates GDPR Articles 6 and 13 (lawful basis and transparency for processing personal data for AI training), GDPR Article 9 where chat content reveals special category data, CCPA's sensitive personal information provisions for voice data and inferred data, and the EU AI Act insofar as training data governance requirements apply to foundation model developers. The FTC and EU data protection authorities are the primary enforcement bodies. The policy does not specify a GDPR lawful basis for AI training use in the base document, deferring to Regional Privacy Disclosures, which may not satisfy standalone transparency obligations under Article 13.

Governance exposure: High. The use of personal data, including voice recordings and chat communications, for AI model training without an explicitly stated lawful basis in the base policy creates material regulatory exposure under GDPR and UK GDPR. Regulators in the EU have scrutinized AI training data practices across multiple platforms, and the inclusion of voice data raises additional sensitivity given its potential treatment as biometric data under certain frameworks.

Jurisdiction flags: EU and UK users face heightened exposure given GDPR and UK GDPR requirements for explicit lawful-basis documentation and data subject rights around automated processing. California users have CCPA rights regarding sensitive personal information, including voice data. Illinois users may have claims under BIPA if voice data is processed in ways that constitute biometric identifier collection. Minor users globally face additional protections under COPPA and the UK Age Appropriate Design Code.

Contract and vendor implications: Vendors and service providers receiving data for model-training purposes must be assessed under GDPR Article 28 data processing agreements. If third-party AI infrastructure providers receive raw training data, their sub-processor status and contractual obligations require verification. This provision may also affect enterprise or API customers who integrate Character.AI into their own services.

Compliance considerations: Compliance teams should document the specific lawful basis claimed for AI training in the Regional Privacy Disclosures and assess whether it aligns with the base policy language. A data mapping exercise should identify which data categories flow into training pipelines, with particular attention to voice data and chat content containing sensitive disclosures. Consent mechanisms for model-training opt-out, referenced separately in the policy's navigation as "About our Model Training," should be audited for accessibility and effectiveness.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive data practices, including undisclosed or inadequately disclosed use of personal data for AI model training.
    File a complaint →

Applicable regulations

  • EU AI Act (European Union)
  • Colorado AI Act (US-CO)
  • GDPR (European Union)
  • Texas AI Act (Texas, USA)
  • UK GDPR (United Kingdom)

Provision details

Document information
Document: Character.ai Privacy Policy
Entity: Character.AI
Document last updated: May 5, 2026

Tracking information
First tracked: May 8, 2026
Last verified: May 11, 2026
Record ID: CA-P-010330
Document ID: CA-D-00120

Evidence provenance
Source URL: Wayback Machine
Content hash (SHA-256): 6ad8585d7de8834f45d45863325899d3602d6584f208eff63eb099fffa024748
Analysis generated: May 8, 2026 14:58 UTC
Evidence: ✓ Snapshot stored · ✓ Hash verified
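The content hash in the record above can be checked against a locally saved copy of the archived snapshot. A minimal sketch in Python (the filename snapshot.html is a placeholder, not part of the record):

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published content hash from the record above.
expected = "6ad8585d7de8834f45d45863325899d3602d6584f208eff63eb099fffa024748"
# matches = sha256_hex("snapshot.html") == expected
```

A match confirms the saved file is byte-identical to the snapshot the analysis was generated from; any mismatch means the copy differs from the archived version.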
Citation Record
Entity: Character.AI
Document: Character.ai Privacy Policy
Record ID: CA-P-010330
Captured: 2026-05-08 14:58:37 UTC
SHA-256: 6ad8585d7de8834f…
URL: https://conductatlas.com/platform/characterai/characterai-privacy-policy/ai-model-training-use-of-chat-and-voice-data/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity: High


Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Character.AI's AI Model Training Use of Chat and Voice Data clause do?

The clause permits Character.AI to use your chat conversations, voice recordings, and other interaction data to train its AI systems and to develop new AI features.

How does this clause affect you?

Your private chat messages and voice recordings may be used to train Character.AI's AI models, meaning the content of your conversations has a use beyond your immediate interaction with the platform.

Is ConductAtlas affiliated with Character.AI?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Character.AI.