You cannot use ElevenLabs to create sexual or intimate audio content involving a real person unless they have explicitly consented to it.
This analysis describes what ElevenLabs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
Non-consensual intimate synthetic audio (sometimes called audio NCII) causes serious harm to victims and is increasingly subject to criminal penalties in multiple jurisdictions; this provision signals ElevenLabs' policy position on this category of misuse.
Users who generate AI voice content of a sexual or intimate nature involving real, identifiable individuals without their consent violate this provision and face account termination, in addition to potential criminal liability under applicable NCII laws.
How other platforms handle this
You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.
Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...
You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content.
Monitoring
ElevenLabs has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"You may not use the Services to generate audio content that is sexually explicit or of an intimate nature featuring any real individual without that individual's explicit consent, including content designed to simulate or suggest intimate scenarios involving identifiable persons.— Excerpt from ElevenLabs's ElevenLabs Usage Policy
(1) REGULATORY LANDSCAPE: This provision implicates the UK Online Safety Act's intimate image abuse offences, which extend to synthetic content; US federal legislation such as the DEFIANCE Act (if enacted and applicable); and state-level NCII statutes in California, Texas, Georgia, and over 40 other states that have extended, or are extending, coverage to AI-generated intimate content. The FTC's authority over deceptive and harmful commercial practices is also potentially relevant where such content is generated via a commercial platform.

(2) GOVERNANCE EXPOSURE: High. The generation of non-consensual intimate synthetic audio exposes both the platform and the user to significant legal liability. ElevenLabs' prohibition is consistent with regulatory direction across multiple jurisdictions, but the policy does not specify a detection or prevention mechanism, leaving enforcement reactive rather than proactive.

(3) JURISDICTION FLAGS: Heightened exposure in the UK under the Online Safety Act, in California under SB 926, in Texas under HB 4337, and under any applicable federal legislation. The EU's GDPR and AI Act also apply where EU residents are the subjects of such content.

(4) CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers should ensure their downstream use cases, particularly consumer-facing products, include technical and policy controls that prevent users from generating NCII via ElevenLabs-powered features. The AUP places enforcement responsibility on users, but platform liability may arise under applicable hosting and intermediary liability frameworks.

(5) COMPLIANCE CONSIDERATIONS: Platform compliance teams should assess whether existing content moderation systems can detect attempts to generate intimate content involving real individuals, and should consider whether additional safeguards or reporting mechanisms are warranted in light of emerging regulatory requirements.
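To make the compliance consideration in point (5) concrete, the sketch below shows one shape a pre-generation policy gate could take in an ElevenLabs-powered product. Every name here is a hypothetical illustration (not an ElevenLabs API), the keyword list is deliberately simplistic, and a production system would rely on trained classifiers, consent records, and human review rather than string matching.

```python
from dataclasses import dataclass

# Illustrative, non-exhaustive markers of intimate-content requests.
# Real systems would use a trained content classifier, not keywords.
INTIMATE_MARKERS = {"sexual", "intimate", "explicit", "nsfw"}

@dataclass
class GenerationRequest:
    text: str                 # script the user wants voiced
    voice_is_cloned: bool     # True if the voice models a real person
    subject_consented: bool   # documented consent from that person

def screen_request(req: GenerationRequest) -> tuple[bool, str]:
    """Return (allowed, reason). Blocks intimate content voiced with a
    clone of a real person unless consent is on file; flagged-but-
    consented requests are routed to human review rather than auto-passed."""
    flagged = any(marker in req.text.lower() for marker in INTIMATE_MARKERS)
    if flagged and req.voice_is_cloned and not req.subject_consented:
        return False, "blocked: intimate content, cloned voice, no consent"
    if flagged and req.voice_is_cloned:
        return True, "allowed: consent on file; route to human review"
    return True, "allowed"
```

A downstream product would call `screen_request` before dispatching audio generation, log the reason string for audit purposes, and treat the human-review path as mandatory rather than advisory. This is a design sketch of the control the analysis recommends, not a statement of how any platform currently enforces its policy.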
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Free: track 1 platform + weekly digest. Watcher: 10 platforms + same-day alerts. No credit card required.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by ElevenLabs.