You are prohibited from using Synthesia to make videos that impersonate real people without their permission, create fake intimate content, or deceive viewers about what is real.
Customers who use the platform to create deceptive or non-consensual synthetic media face immediate account termination and potential civil or criminal liability under emerging deepfake legislation in the UK, EU, and US states.
These prohibitions reflect emerging legal requirements around synthetic media and deepfakes, and violations could expose both Synthesia and its customers to criminal liability or regulatory sanctions in multiple jurisdictions.
REGULATORY FRAMEWORK: This provision directly engages the UK Online Safety Act 2023 (which criminalises the sharing of non-consensual intimate deepfakes), the EU AI Act Articles 50 and 52 (transparency obligations for AI-generated synthetic media), and state deepfake laws in California (AB 602, AB 730), Texas (HB 4337), Virginia, and New York. GDPR Article 9 is implicated where synthetic media involves processing the biometric data of identifiable individuals. Section 5 of the FTC Act applies to deceptive AI-generated commercial content.