Synthesia · Synthesia Terms of Service

AI Avatar Consent and Misuse Prohibition

High severity · High confidence · Explicit document language · Unique (0 of 325 platforms)
Document Record

What it is

You are not allowed to create AI videos using a real person's face, voice, or identity without getting their written permission first, and you cannot use the platform to make deceptive deepfake content.

This analysis describes what Synthesia's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This restriction places the legal and ethical burden of obtaining consent directly on the customer, and failure to comply constitutes a breach of agreement that can trigger immediate suspension and indemnification obligations.

Consumer impact (what this means for users)

Customers who create avatar content using real human likenesses without documented written consent are in breach of the agreement and potentially liable for all resulting legal claims, including those brought against Synthesia by affected individuals.

How other platforms handle this

Runway · Medium severity

You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.

Mistral AI · Medium severity

Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...

Perplexity AI · Medium severity

You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content.


Monitoring

Synthesia has changed this document before.

Original Clause Language
You must not use the Services to create content that: uses the likeness, voice, or personal data of any individual without their prior written consent; impersonates any person or entity in a misleading or deceptive manner; creates deepfakes or other synthetic media intended to deceive viewers about the identity or statements of real individuals.

— Excerpt from the Synthesia Terms of Service


Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision directly engages the EU AI Act, which imposes transparency requirements on AI-generated content, including deepfakes, and requires disclosure to recipients that content is AI-generated. GDPR Article 9 is relevant where voice or facial data constituting biometric data is processed without explicit consent. In the US, Illinois BIPA, California's AB 602 and AB 730 (synthetic media statutes), and Texas CUBI create specific consent and disclosure requirements for biometric and synthetic media use. The FTC Act applies to deceptive use of synthetic media in commerce.

GOVERNANCE EXPOSURE: High. The consent requirement for real-person likenesses and voices is legally significant given the expanding regulatory landscape for synthetic media. Customers producing content at scale involving real individuals (corporate communications, training videos with real employees) must implement robust consent management processes. Non-compliance creates both contractual breach and direct regulatory exposure.

JURISDICTION FLAGS: Illinois BIPA creates a private right of action for biometric data collection without written consent, making it the highest-risk US jurisdiction for customers collecting voice or facial biometric data. EU customers must comply with GDPR explicit consent requirements for biometric data. Several US states have enacted or are enacting synthetic media and deepfake laws that independently require consent, regardless of Synthesia's contractual terms.

CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers deploying Synthesia for internal communications, training, or marketing involving real employees or individuals should implement written consent documentation workflows. This is a standard due diligence requirement for any AI avatar deployment at scale and should be included in HR and content production policies.

COMPLIANCE CONSIDERATIONS: Customers should develop and implement a consent management framework for all avatar content creation involving real human likenesses, including template consent forms, version control for consent records, and withdrawal processes. Legal teams should monitor synthetic media legislation in relevant operating jurisdictions, as this regulatory area is evolving rapidly.
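The consent-record workflow described above (template forms, version control, withdrawal) can be sketched in code. This is a minimal illustrative sketch, not Synthesia's system or any real library: the `ConsentRecord` fields and the `ConsentStore` API are assumptions chosen to show versioned grants and recorded withdrawals.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    subject_name: str              # the real person whose likeness/voice is used
    scope: str                     # e.g. "internal training video, likeness + voice"
    granted_at: datetime
    document_ref: str              # pointer to the signed written consent form
    version: int = 1
    withdrawn_at: Optional[datetime] = None

class ConsentStore:
    """Append-only store: amendments create new versions, withdrawal is recorded."""

    def __init__(self) -> None:
        self._records: dict[str, list[ConsentRecord]] = {}

    def grant(self, rec: ConsentRecord) -> None:
        # Each new grant for the same subject becomes the next version.
        history = self._records.setdefault(rec.subject_name, [])
        rec.version = len(history) + 1
        history.append(rec)

    def withdraw(self, subject_name: str) -> None:
        # Withdrawal is recorded, never deleted, so the audit trail survives.
        history = self._records.get(subject_name, [])
        if history:
            history[-1].withdrawn_at = datetime.now(timezone.utc)

    def has_valid_consent(self, subject_name: str) -> bool:
        history = self._records.get(subject_name, [])
        return bool(history) and history[-1].withdrawn_at is None

# Hypothetical usage: gate avatar production on a current, unwithdrawn record.
store = ConsentStore()
store.grant(ConsentRecord("Jane Doe", "marketing video, likeness + voice",
                          datetime.now(timezone.utc),
                          "consent-forms/jdoe-2026-04.pdf"))
print(store.has_valid_consent("Jane Doe"))   # True until withdrawal is recorded
store.withdraw("Jane Doe")
print(store.has_valid_consent("Jane Doe"))   # False
```

The design choice worth noting: records are appended and withdrawals timestamped rather than rows deleted, which preserves the evidence trail a BIPA or GDPR inquiry would expect.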


Applicable agencies

  • FTC
    The FTC has authority over deceptive synthetic media practices and enforcement of consumer protection laws applicable to deepfake and AI-generated content used in commerce
  • State AG
    State Attorneys General in Illinois, California, Texas, and other states with synthetic media or biometric privacy statutes have enforcement authority over violations involving residents of those states

Applicable regulations

  • CFAA (United States, Federal)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
  • Document: Synthesia Terms of Service
  • Entity: Synthesia
  • Document last updated: May 5, 2026

Tracking information
  • First tracked: April 30, 2026
  • Last verified: May 10, 2026
  • Record ID: CA-P-008193
  • Document ID: CA-D-00471
Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): c160c307398b191d34823085b7d2f7605405571da01ba21c03580602a3cc6c1d
  • Analysis generated: April 30, 2026 09:49 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
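A record like the one above can be independently verified by hashing the archived snapshot and comparing it to the published SHA-256. The sketch below is a generic verification routine; the file path in the usage comment is a hypothetical placeholder, and only the hash value comes from the record above.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Stream the file in chunks so large snapshots don't load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_snapshot(path: str, expected_hash: str) -> bool:
    """Return True if the file's SHA-256 matches the published content hash."""
    return sha256_of_file(path) == expected_hash.lower()

# Hypothetical usage against a locally stored snapshot:
# verify_snapshot(
#     "snapshots/synthesia-tos-2026-04-30.html",
#     "c160c307398b191d34823085b7d2f7605405571da01ba21c03580602a3cc6c1d",
# )
```

A matching digest shows the local copy is byte-identical to the snapshot that was hashed; any edit to the file, however small, changes the digest entirely.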
Citation Record
Entity: Synthesia
Document: Synthesia Terms of Service
Record ID: CA-P-008193
Captured: 2026-04-30 09:49:49 UTC
SHA-256: c160c307398b191d…
URL: https://conductatlas.com/platform/synthesia/synthesia-terms-of-service/ai-avatar-consent-and-misuse-prohibition/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
  • Severity: High



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Synthesia's AI Avatar Consent and Misuse Prohibition clause do?

This restriction places the legal and ethical burden of obtaining consent directly on the customer, and failure to comply constitutes a breach of agreement that can trigger immediate suspension and indemnification obligations.

How does this clause affect you?

Customers who create avatar content using real human likenesses without documented written consent are in breach of the agreement and potentially liable for all resulting legal claims, including those brought against Synthesia by affected individuals.

Is ConductAtlas affiliated with Synthesia?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Synthesia.