You cannot use ElevenLabs to create a copy of a real person's voice unless that person has explicitly agreed to it first, whether they are famous or not.
This analysis describes what ElevenLabs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
This provision is central to the policy because voice cloning without consent is one of the primary potential misuse vectors of AI audio synthesis technology, and violations can result in account termination as well as potential legal liability for the user.
Interpretive note: The document does not specify the form or verification standard for 'explicit prior consent,' creating ambiguity about what consent documentation would satisfy this requirement in practice.
Any user who submits recordings of another person's voice to ElevenLabs for cloning without that person's documented consent violates this provision and risks losing account access, in addition to potential civil or criminal liability under applicable deepfake or identity laws.
How other platforms handle this
You may not access the Services in any way other than through the currently available, published interfaces that we provide. For example, this means that you cannot scrape the Services without X's express written permission, try to work around any technical limitations we impose, or otherwise attemp...
You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.
Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...
Monitoring
ElevenLabs has changed this document before.
"You may not use the Services to clone, replicate, or synthesize the voice of any real individual without that individual's explicit prior consent. This prohibition applies regardless of whether the voice is that of a public figure, celebrity, or private individual." — Excerpt from the ElevenLabs Usage Policy
(1) REGULATORY LANDSCAPE: This provision engages GDPR Article 9 where voice data constitutes biometric data processed to uniquely identify a natural person, requiring explicit consent as a lawful basis for processing; the EU AI Act's transparency and consent requirements for AI-generated synthetic media; California AB 1836 and similar state statutes addressing unauthorized use of a person's voice or likeness; and the FTC Act's prohibition on unfair or deceptive commercial practices. The policy's prohibition is broadly consistent with these frameworks, though the absence of a specified consent verification mechanism may create tension with GDPR's accountability principle and the EU AI Act's forthcoming implementing regulations.

(2) GOVERNANCE EXPOSURE: High. The voice cloning consent requirement creates direct legal exposure for both ElevenLabs and its API customers if consent is not obtained and documented prior to processing. Enterprise customers embedding the API in their own products bear independent responsibility under applicable data protection and identity laws for ensuring consent is obtained from voice data subjects.

(3) JURISDICTION FLAGS: Heightened exposure exists in the EU/EEA under GDPR biometric data provisions, in California under AB 1836 and the California Celebrities Rights Act, in Illinois under BIPA if voice prints are treated as biometric identifiers, and in New York under its right of publicity statute. Several additional US states have enacted or are considering synthetic voice legislation.

(4) CONTRACT AND VENDOR IMPLICATIONS: B2B customers integrating ElevenLabs's voice cloning API should assess whether their vendor agreements with ElevenLabs allocate responsibility for consent verification, and should implement their own consent collection workflows. The policy as written places the compliance obligation on the user, not ElevenLabs, which represents a liability shift that procurement teams should evaluate.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should implement documented consent workflows for any voice cloning use case, conduct data mapping to identify whether voice samples constitute biometric data under applicable law, and assess whether existing privacy notices adequately disclose voice data processing to data subjects.
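Because the policy does not specify what form "explicit prior consent" must take, teams integrating voice cloning must define their own documentation standard. The sketch below illustrates one possible pre-upload consent gate; the record fields and function names are illustrative assumptions, not part of ElevenLabs's actual API or any mandated standard.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical documentation of one voice data subject's consent."""
    subject_name: str
    consent_obtained_at: datetime  # when explicit consent was given (UTC)
    evidence_uri: str              # e.g. a signed release form or recorded statement
    revoked: bool = False          # subject may withdraw consent later

def consent_gate(record: Optional[ConsentRecord]) -> bool:
    """Return True only if documented, unrevoked, *prior* consent exists.

    Intended to run before any voice sample is submitted for cloning,
    enforcing an 'explicit prior consent' requirement in application code.
    """
    if record is None or record.revoked:
        return False
    # Consent must predate the upload attempt to count as prior consent.
    return record.consent_obtained_at <= datetime.now(timezone.utc)

# Example: block an upload when no consent record is on file.
if not consent_gate(None):
    print("Upload blocked: no documented consent on file")
```

A production workflow would also log the gate's decision and retain the evidence URI for audit, supporting the accountability obligations discussed above.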
Built from archived source documents, structured governance mappings, and historical version tracking.
Is ConductAtlas affiliated with ElevenLabs? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by ElevenLabs.