
Prohibition on Deepfakes and Deceptive Synthetic Media

High severity

What it is

You are prohibited from using Synthesia to make videos that impersonate real people without their permission, create fake intimate content, or deceive viewers about what is real.

Consumer impact (what this means for users)

Customers who use the platform to create deceptive or non-consensual synthetic media face immediate account termination and potential civil or criminal liability under emerging deepfake legislation in the UK, EU, and US states.

Why it matters (compliance & risk perspective)

These prohibitions reflect emerging legal requirements around synthetic media and deepfakes, and violation could expose both Synthesia and customers to criminal liability or regulatory sanctions in multiple jurisdictions.

Original clause language
You may not use the Services to create content that: (a) depicts a real person without their consent; (b) constitutes non-consensual intimate imagery; (c) is designed to deceive viewers into believing it is authentic footage of a real person or event; or (d) is used to harass, defame, or harm any individual.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: This provision directly engages the UK Online Safety Act 2023 (which criminalises non-consensual intimate deepfakes), the EU AI Act Articles 50 and 52 (transparency obligations for AI-generated synthetic media), and state deepfake laws in California (AB 602, AB 730), Texas (HB 4337), Virginia, and New York. GDPR Article 9 is implicated where synthetic media involves processing biometric data of identifiable individuals. The FTC Act Section 5 applies to deceptive AI-generated commercial content.


Applicable agencies

  • FTC
    The FTC has authority over deceptive AI-generated content used in commercial contexts under Section 5 of the FTC Act, and has issued guidance on AI-generated media disclosures.
  • State AG
    State Attorneys General in California, Texas, Virginia, and New York have enforcement authority under deepfake-specific legislation and state consumer protection laws applicable to synthetic media misuse.

Provision details

Document information
Document: Synthesia Terms of Service
Entity: Synthesia
Document last updated: April 29, 2026

Tracking information
First tracked: April 30, 2026
Last verified: April 30, 2026
Record ID: CA-P-004393
Document ID: CA-D-00471
Evidence Provenance
Source URL: Wayback Machine
SHA-256: c160c307398b191d34823085b7d2f7605405571da01ba21c03580602a3cc6c1d
Verified: ✓ Snapshot stored · ✓ Change verified
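The stored SHA-256 digest lets anyone independently confirm that a local copy of the archived document matches the snapshot on record. A minimal sketch in Python, assuming you have saved the snapshot locally (the filename below is hypothetical; only the digest is taken from this record):

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large snapshots do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest recorded for this provision (CA-P-004393).
RECORDED = "c160c307398b191d34823085b7d2f7605405571da01ba21c03580602a3cc6c1d"

# Hypothetical local filename for the archived snapshot:
# if sha256_hex("synthesia-tos-snapshot.html") == RECORDED:
#     print("Snapshot matches the archived record.")
```

A match confirms the local copy is byte-for-byte identical to the snapshot captured on April 30, 2026; any edit to the file, however small, produces a different digest.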
How to Cite
ConductAtlas Policy Archive
Entity: Synthesia | Document: Synthesia Terms of Service | Record: CA-P-004393
Captured: 2026-04-30 09:49:49 UTC | SHA-256: c160c307398b191d…
URL: https://conductatlas.com/platform/synthesia/synthesia-terms-of-service/prohibition-on-deepfakes-and-deceptive-synthetic-media/
Accessed: May 2, 2026
Classification
Severity: High
Categories
