Runway · Runway Usage Policy

Prohibition on Real Person Likeness Without Permission

Medium severity · Medium confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Runway prohibits using any image, video, or audio of a real person as input or in generated content without that person's permission.

This analysis describes what Runway's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This provision has broad practical implications for users who work with real-world footage or audio, including in commercial, journalistic, or creative contexts, as it requires affirmative consent from the depicted individual.

Interpretive note: The policy does not define 'permission' or specify whether implied, contractual, or statutory consent suffices; the absence of fair use or public interest carve-outs may create tension with applicable law in some jurisdictions.

Consumer impact (what this means for users)

Users who generate content incorporating the likeness, voice, or image of real people without documented consent risk account suspension; this includes use cases such as AI-generated video of public figures, voice cloning, and deepfake-style content for any purpose.

How other platforms handle this

X Medium

You may not access the Services in any way other than through the currently available, published interfaces that we provide. For example, this means that you cannot scrape the Services without X's express written permission, try to work around any technical limitations we impose, or otherwise attemp...

Mistral AI Medium

Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...

Perplexity AI Medium

You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content.


Monitoring

Runway has changed this document before.

Original Clause Language

"Use of an image, video, or audio of another person without their permission"

— Excerpt from Runway's Runway Usage Policy

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

(1) Regulatory landscape: Right-of-publicity laws vary significantly by U.S. state; California's Civil Code Section 3344 and equivalent statutes provide robust protection against commercial use of a person's likeness. The EU's GDPR treats biometric and identifying data with heightened protection, and the EU AI Act includes specific provisions on biometric identification systems. Emerging U.S. federal proposals addressing AI and likeness rights are relevant context.

(2) Governance exposure: Medium to High. The prohibition is broad and may restrict legitimate journalistic, documentary, historical, or parody uses that are protected under applicable law. The policy carves out no fair use, public interest, or journalistic exceptions, even though such exceptions are recognized under U.S. and EU law.

(3) Jurisdiction flags: California, New York, and Illinois have strong right-of-publicity statutes. In the EU, GDPR's biometric data provisions and national personality rights laws create additional compliance layers. The prohibition on using audio of another person without permission may also engage the voice-cloning regulations emerging in multiple jurisdictions.

(4) Contract and vendor implications: Enterprise customers in media, advertising, and entertainment should assess whether their content workflows can demonstrate consent for every real-person likeness and audio input used with Runway's tools. B2B contracts should address how consent documentation is maintained and what liability flows if a third-party submitter cannot demonstrate permission.

(5) Compliance considerations: Legal teams should assess whether the policy's consent requirement aligns with the company's own practices around training data and model development. Compliance programs should include documented consent-verification workflows for enterprise use cases involving real-person content.


Applicable agencies

  • FTC — has jurisdiction over deceptive or unfair practices involving AI-generated likeness content and impersonation harms.
  • State attorneys general — in jurisdictions with right-of-publicity or biometric privacy statutes (such as California, New York, and Illinois), state AGs have enforcement authority over unauthorized likeness use.

Applicable regulations

  • CFAA (United States, federal)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
Document: Runway Usage Policy
Entity: Runway
Document last updated: May 11, 2026

Tracking information
First tracked: May 11, 2026
Last verified: May 11, 2026
Record ID: CA-P-010705
Document ID: CA-D-00773

Evidence Provenance
Source URL: Wayback Machine
Content hash (SHA-256): c27b1ea06341626cc8144e29bf34df0e3b48b2db15a10460062e01e81488dc94
Analysis generated: May 11, 2026 13:13 UTC
Evidence: ✓ Snapshot stored · ✓ Hash verified
Citation Record
Entity: Runway
Document: Runway Usage Policy
Record ID: CA-P-010705
Captured: 2026-05-11 13:13:43 UTC
SHA-256: c27b1ea06341626c…
URL: https://conductatlas.com/platform/runway/runway-usage-policy/prohibition-on-real-person-likeness-without-permission/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity: Medium



Frequently Asked Questions

What does Runway's Prohibition on Real Person Likeness Without Permission clause do?

Runway prohibits using any image, video, or audio of a real person as input or in generated content without that person's permission. Because it requires affirmative consent from the depicted individual, the provision has broad practical implications for users who work with real-world footage or audio, including in commercial, journalistic, or creative contexts.

How does this clause affect you?

Users who generate content incorporating the likeness, voice, or image of real people without documented consent risk account suspension; this includes use cases such as AI-generated video of public figures, voice cloning, and deepfake-style content for any purpose.

Is ConductAtlas affiliated with Runway?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Runway.