If you use ElevenLabs to create an AI copy of someone's voice, you must first get that person's clear permission. Cloning a voice without permission can get your account suspended, and the violation may be reported to law enforcement.
This analysis describes what ElevenLabs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
This provision places the legal and operational responsibility for obtaining third-party consent entirely on the user, not on ElevenLabs, which means individuals and businesses using voice cloning features bear the compliance risk if consent is inadequate or undocumented.
Interpretive note: The policy does not specify the verification mechanism ElevenLabs uses to confirm consent has been obtained, leaving the operational implementation of this requirement uncertain.
The policy requires users to obtain explicit consent before cloning any person's voice. Individuals whose voices are cloned without consent may report the violation to ElevenLabs; the policy states the company may suspend the offending account and refer the matter to authorities.
"You must obtain explicit consent from any individual whose voice you clone using our platform. Cloning a person's voice without their permission is a violation of this policy and may result in immediate account suspension and referral to relevant authorities.— Excerpt from ElevenLabs's ElevenLabs Safety Policy
REGULATORY LANDSCAPE: This provision engages GDPR Article 9, which classifies biometric data used for identification (including voice prints) as a special category requiring explicit consent for processing in EU/EEA contexts; the Illinois Biometric Information Privacy Act (BIPA), which requires written consent and a retention/destruction policy for biometric identifiers including voiceprints; and the California Personality Rights Act addressing commercial use of a person's voice or likeness. The FTC Act applies where unauthorized voice cloning is used in commerce or fraud. Relevant enforcement authorities include EU national data protection supervisory authorities, the Illinois Attorney General, the California Attorney General, and the FTC.

GOVERNANCE EXPOSURE: High. The provision places affirmative consent obligations on users but does not specify the mechanism by which ElevenLabs verifies or audits that consent has been obtained, creating a gap between policy assertion and operational implementation. Enterprise customers deploying voice cloning at scale without a documented consent management process face direct exposure under BIPA and GDPR Article 9.

JURISDICTION FLAGS: Illinois presents the highest exposure due to BIPA's private right of action and statutory damages of $1,000 to $5,000 per violation. EU/EEA deployments require demonstrable explicit consent under GDPR Article 9, and the adequacy of a user attestation model (without verification) may be questioned by supervisory authorities. California's AB 2839 and personality rights statutes create additional exposure for commercial or political use cases.

CONTRACT AND VENDOR IMPLICATIONS: Procurement teams should confirm whether ElevenLabs' enterprise agreements include data processing addenda (DPAs) that address voice biometric data as a special category under GDPR. The policy's placement of consent obligations on users, rather than on ElevenLabs, represents a liability allocation that enterprise customers should evaluate in the context of their own compliance posture. B2B contracts should specify consent documentation standards and audit rights.

COMPLIANCE CONSIDERATIONS: Organizations using ElevenLabs' voice cloning features should implement a documented consent workflow that captures the date, method, and scope of consent from each voice subject. Data mapping exercises should classify voice clone data as biometric under BIPA and as special category data under GDPR. Legal teams should assess whether existing privacy notices and consent forms cover AI voice cloning as a specific processing purpose.
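To make the consent-workflow elements above concrete, here is a minimal sketch of a consent record in Python. The field names, the `ConsentRecord` class, and the validity rule are illustrative assumptions for a documented workflow, not a schema required by BIPA, GDPR, or ElevenLabs; real implementations should be designed with counsel.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ConsentRecord:
    """One documented consent from a voice subject (hypothetical schema)."""
    subject_name: str
    obtained_on: date            # date consent was captured
    method: str                  # e.g. "signed form", "recorded verbal statement"
    scope: str                   # e.g. "product demo narration only"
    written: bool = False        # BIPA requires written consent for voiceprints
    expires_on: Optional[date] = None

    def is_valid(self, on: Optional[date] = None) -> bool:
        """Treat consent as usable only if written and not expired."""
        on = on or date.today()
        return self.written and (self.expires_on is None or on <= self.expires_on)

record = ConsentRecord(
    subject_name="Jane Doe",
    obtained_on=date(2024, 3, 1),
    method="signed consent form",
    scope="product demo narration",
    written=True,
    expires_on=date(2026, 3, 1),
)
print(record.is_valid(on=date(2025, 1, 1)))  # True: written and within term
```

Capturing date, method, and scope in a structured record like this is what makes consent auditable later, which is the gap the policy otherwise leaves to users.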
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by ElevenLabs.