ElevenLabs · ElevenLabs Usage Policy

Voice Cloning Without Consent Prohibition

High severity · Medium confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

You cannot use ElevenLabs to create a copy of a real person's voice unless that person has explicitly agreed to it first, whether they are famous or not.

This analysis describes what ElevenLabs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This provision is central to the policy because voice cloning without consent is one of the primary potential misuse vectors of AI audio synthesis technology, and violations can result in account termination as well as potential legal liability for the user.

Interpretive note: The document does not specify the form or verification standard for 'explicit prior consent,' creating ambiguity about what consent documentation would satisfy this requirement in practice.

Consumer impact (what this means for users)

Any user who submits a third-party person's voice recordings to ElevenLabs for cloning without that person's documented consent is in violation of this provision and risks losing account access, in addition to potential civil or criminal liability under applicable deepfake or identity laws.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    If you have previously uploaded voice samples without the subject's consent, delete those voice models from your ElevenLabs account immediately and contact ElevenLabs support to confirm data deletion.

How other platforms handle this

X · Medium severity

You may not access the Services in any way other than through the currently available, published interfaces that we provide. For example, this means that you cannot scrape the Services without X's express written permission, try to work around any technical limitations we impose, or otherwise attemp...

Runway · Medium severity

You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.

Mistral AI · Medium severity

Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...


Monitoring

ElevenLabs has changed this document before.

Original Clause Language
You may not use the Services to clone, replicate, or synthesize the voice of any real individual without that individual's explicit prior consent. This prohibition applies regardless of whether the voice is that of a public figure, celebrity, or private individual.

— Excerpt from the ElevenLabs Usage Policy


Institutional analysis (Compliance & governance intelligence)

(1) REGULATORY LANDSCAPE: This provision engages GDPR Article 9 where voice data constitutes biometric data processed to uniquely identify a natural person, requiring explicit consent as a lawful basis for processing; the EU AI Act's transparency and consent requirements for AI-generated synthetic media; California AB 1836 and similar state statutes addressing unauthorized use of a person's voice or likeness; and the FTC Act's prohibition on unfair or deceptive commercial practices. The policy's prohibition is broadly consistent with these frameworks, though the absence of a specified consent verification mechanism may create tension with GDPR's accountability principle and the EU AI Act's forthcoming implementing regulations.

(2) GOVERNANCE EXPOSURE: High. The voice cloning consent requirement creates direct legal exposure for both ElevenLabs and its API customers if consent is not obtained and documented prior to processing. Enterprise customers embedding the API in their own products bear independent responsibility under applicable data protection and identity laws for ensuring consent is obtained from voice data subjects.

(3) JURISDICTION FLAGS: Heightened exposure exists in the EU/EEA under GDPR biometric data provisions, in California under AB 1836 and the California Celebrities Rights Act, in Illinois under BIPA if voice prints are treated as biometric identifiers, and in New York under its right of publicity statute. Several additional US states have enacted or are considering synthetic voice legislation.

(4) CONTRACT AND VENDOR IMPLICATIONS: B2B customers integrating ElevenLabs's voice cloning API should assess whether their vendor agreements with ElevenLabs allocate responsibility for consent verification, and should implement their own consent collection workflows. The policy as written places the compliance obligation on the user, not ElevenLabs, which represents a liability shift that procurement teams should evaluate.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should implement documented consent workflows for any voice cloning use case, conduct data mapping to identify whether voice samples constitute biometric data under applicable law, and assess whether existing privacy notices adequately disclose voice data processing to data subjects.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive practices involving unauthorized synthetic voice generation used in commercial contexts.
    File a complaint →
  • State AG
    State attorneys general in California, Illinois, Texas, and Virginia have enforcement authority under applicable deepfake and biometric privacy statutes.
    File a complaint →

Applicable regulations

  • CFAA (United States, Federal)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
  • Document: ElevenLabs Usage Policy
  • Entity: ElevenLabs
  • Document last updated: May 11, 2026

Tracking information
  • First tracked: May 11, 2026
  • Last verified: May 11, 2026
  • Record ID: CA-P-010708
  • Document ID: CA-D-00779

Evidence provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): 3b04c061ee875cc733cfece1b436238b97a43b0e5ec22aaacc3176c33d57981a
  • Analysis generated: May 11, 2026 13:18 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
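The recorded SHA-256 digest lets anyone independently verify that an archived snapshot is byte-identical to the document that was analyzed. A minimal check, assuming you have already downloaded the snapshot bytes (retrieval from the Wayback Machine is left out of the sketch):

```python
import hashlib

# Digest recorded above for the ElevenLabs Usage Policy snapshot.
RECORDED_HASH = "3b04c061ee875cc733cfece1b436238b97a43b0e5ec22aaacc3176c33d57981a"

def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of raw snapshot bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_record(snapshot: bytes) -> bool:
    # Any single-byte difference in the snapshot changes the digest entirely.
    return sha256_hex(snapshot) == RECORDED_HASH
```

Note that the comparison must run over the exact archived bytes; re-rendering the page or normalizing whitespace will produce a different digest.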
Citation Record
Entity: ElevenLabs
Document: ElevenLabs Usage Policy
Record ID: CA-P-010708
Captured: 2026-05-11 13:18:12 UTC
SHA-256: 3b04c061ee875cc7…
URL: https://conductatlas.com/platform/elevenlabs/elevenlabs-usage-policy/voice-cloning-without-consent-prohibition/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
  • Severity: High

Frequently Asked Questions

What does ElevenLabs's Voice Cloning Without Consent Prohibition clause do?

It prohibits using ElevenLabs's services to clone, replicate, or synthesize a real person's voice without that person's explicit prior consent, regardless of whether the person is a public figure or a private individual. This provision is central to the policy because voice cloning without consent is one of the primary potential misuse vectors of AI audio synthesis technology, and violations can result in account termination as well as potential legal liability for the user.

How does this clause affect you?

Any user who submits a third-party person's voice recordings to ElevenLabs for cloning without that person's documented consent is in violation of this provision and risks losing account access, in addition to potential civil or criminal liability under applicable deepfake or identity laws.

Is ConductAtlas affiliated with ElevenLabs?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by ElevenLabs.