ElevenLabs · ElevenLabs Privacy Policy

Voice Data Use for AI Model Training

High severity · Medium confidence · Inferred from context · Unique · 0 of 325 platforms
Document Record

What it is

ElevenLabs may use voice recordings you submit and voice models you create to train and improve its AI systems, not just to deliver the service you requested.

This analysis describes what ElevenLabs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

Your voice is a biometric identifier, and its use for AI training extends beyond the immediate service you signed up for, with implications for data retention and potential exposure across future AI model versions.

Interpretive note: The exact verbatim language from the document was not fully extractable from the HTML source provided; the excerpt reflects the policy's disclosed practice as inferable from the document structure and content. Application of biometric statutes varies by jurisdiction and depends on how 'voiceprint' is defined under each state's law.

Consumer impact (what this means for users)

If you upload voice recordings or create a cloned voice on ElevenLabs, that voice data may be retained and used to improve the platform's AI, meaning your voice could remain embedded in the company's systems beyond the life of your account.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    Email privacy@elevenlabs.io requesting deletion of your voice recordings and any cloned voice models associated with your account. Specify that you want your data excluded from AI model training.

How other platforms handle this

Windsurf (Medium severity)

We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...

Writer (Medium severity)

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.

Ideogram (Medium severity)

We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.


Monitoring

ElevenLabs has changed this document before.

Original Clause Language

"We may use the information we collect, including voice recordings and voice models created through our platform, to develop, train, and improve our AI models and services. This may include using your voice data to enhance the quality and performance of our text-to-speech and voice cloning technologies."

— Excerpt from ElevenLabs's ElevenLabs Privacy Policy


Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision engages Illinois BIPA (which requires written consent before collecting biometric identifiers, including voiceprints), Texas CUBI, Washington's state biometric law, and GDPR Article 9 if voice data is processed as biometric data for the purpose of uniquely identifying a natural person. The FTC may also scrutinize this practice under its unfair or deceptive practices authority if consent mechanisms are found to be inadequate. The EU AI Act may impose additional obligations on providers using personal data to train general-purpose AI models.

GOVERNANCE EXPOSURE: High. The use of voice recordings for AI model training without explicit, jurisdiction-specific consent mechanisms creates material legal exposure in Illinois, Texas, and Washington, where statutory damages for BIPA violations can reach $1,000-$5,000 per violation per person. GDPR also requires a clear and specific legal basis for processing biometric data, with consent being the most defensible basis in this context.

JURISDICTION FLAGS: Illinois presents the highest exposure given BIPA's private right of action and statutory damages. Texas and Washington have analogous statutes enforced by state attorneys general. EU/EEA users are protected under GDPR Article 9, which treats biometric data as a special category requiring explicit consent. California's CPRA also addresses sensitive personal information, including biometric data.

CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers integrating ElevenLabs via API who submit end-user voice data should assess whether they have obtained adequate downstream consent from their own users. Data processing agreements should specify the purposes for which voice data may be used by ElevenLabs, and whether AI training constitutes a permitted secondary use.

COMPLIANCE CONSIDERATIONS: Compliance teams should audit whether the consent obtained at account creation or at the point of voice submission satisfies written consent requirements under BIPA and equivalent statutes. If the current consent flow relies solely on acceptance of the privacy policy, this may be insufficient under BIPA. A data mapping exercise should identify all touchpoints where voice data is collected, with the downstream processing purposes documented.


Applicable agencies

  • FTC
    The FTC has authority over unfair or deceptive data practices, including inadequate disclosure or consent for biometric voice data collection and use for AI training purposes.
  • State AG
    State attorneys general in Illinois, Texas, Washington, and California have enforcement authority over biometric privacy and sensitive personal information statutes implicated by voice data collection.

Applicable regulations

  • EU AI Act (European Union)
  • California AB 2013 AI Training Data Transparency (US-CA)
  • Colorado AI Act (US-CO)
  • EU AI Act - High Risk Provisions (EU)
  • GDPR (European Union)
  • Texas AI Act (Texas, USA)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
  • Document: ElevenLabs Privacy Policy
  • Entity: ElevenLabs
  • Document last updated: May 5, 2026

Tracking information
  • First tracked: April 30, 2026
  • Last verified: May 10, 2026
  • Record ID: CA-P-009379
  • Document ID: CA-D-00450

Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): b75b1f8acd13a68881f4fcb9606d10a24499d50e9b26f218570263ebce7417e9
  • Analysis generated: April 30, 2026 09:06 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
Citation Record
Entity: ElevenLabs
Document: ElevenLabs Privacy Policy
Record ID: CA-P-009379
Captured: 2026-04-30 09:06:47 UTC
SHA-256: b75b1f8acd13a688…
URL: https://conductatlas.com/platform/elevenlabs/elevenlabs-privacy-policy/voice-data-use-for-ai-model-training/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
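
The published content hash above makes the archived snapshot independently checkable. As a minimal sketch (the local snapshot path and the assumption that the archived file is byte-for-byte identical to the hashed capture are ours, not part of the record), a stored copy can be compared against the published SHA-256 digest:

```python
import hashlib

# Published digest from the Evidence Provenance record above.
EXPECTED_SHA256 = "b75b1f8acd13a68881f4fcb9606d10a24499d50e9b26f218570263ebce7417e9"

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """Hash a locally archived snapshot in chunks and compare
    the hex digest to the published value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected
```

A mismatch only shows the local copy differs from the hashed capture; it does not by itself indicate which copy is authoritative.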
Classification
  • Severity: High
  • Categories:



Frequently Asked Questions

What does ElevenLabs's Voice Data Use for AI Model Training clause do?

It permits ElevenLabs to use voice recordings you submit and voice models you create to train and improve its AI systems, not just to deliver the service you requested. Because your voice is a biometric identifier, this use extends beyond the immediate service, with implications for data retention and potential exposure across future AI model versions.

How does this clause affect you?

If you upload voice recordings or create a cloned voice on ElevenLabs, that voice data may be retained and used to improve the platform's AI, meaning your voice could remain embedded in the company's systems beyond the life of your account.

Is ConductAtlas affiliated with ElevenLabs?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by ElevenLabs.