Runway · Runway Usage Policy

Prohibition on Non-Consensual Intimate Imagery

High severity · High confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

You cannot use Runway to create fake sexual images or videos of real people without their consent, including AI-generated intimate content that uses a real person's likeness.

This analysis describes what Runway's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

Non-consensual intimate imagery causes severe harm to depicted individuals and is addressed by an increasing number of state and federal statutes; this provision establishes Runway's policy alignment with legal prohibitions on AI-generated NCII.

Consumer impact (what this means for users)

The terms prohibit generating AI-based non-consensual intimate imagery of real individuals, which is a category of content increasingly prohibited by state law and subject to federal legislative proposals; users who generate such content may face both platform termination and independent legal liability.

How other platforms handle this

Mistral AI (Medium severity)

"Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu..."

Perplexity AI (Medium severity)

"You may not use the Services to attempt to circumvent, disable, or otherwise interfere with safety-related features of the Services, including features that prevent or restrict the generation of certain types of content."

AI21 Labs (Medium severity)

"You may not use the Services, including any outputs, to develop, train, fine-tune, or improve any machine learning model or artificial intelligence system that competes with AI21's products or services."


Monitoring

Runway has changed this document before.

Original Clause Language (Document Record)

"You may not use Runway's tools to create non-consensual intimate imagery (NCII), including AI-generated sexual content depicting real individuals without their explicit consent."

— Excerpt from Runway's Runway Usage Policy

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision engages state NCII statutes enacted in approximately 48 US states. The federal DEFIANCE Act (enacted 2024) creates a civil cause of action for victims of AI-generated NCII. The EU's Digital Services Act and General Data Protection Regulation address unauthorized processing of biometric and intimate imagery data, and California's AB 602 specifically addresses AI-generated NCII.

GOVERNANCE EXPOSURE: High. AI-generated NCII is one of the most rapidly expanding areas of digital-harm legislation. Platform operators who permit or fail to detect NCII generation may face civil liability under the DEFIANCE Act and state statutes, as well as regulatory scrutiny. Detecting AI-generated NCII before distribution remains a significant operational challenge.

JURISDICTION FLAGS: US federal law (DEFIANCE Act, 2024), California (AB 602), and approximately 48 state NCII statutes create broad geographic coverage. EU member states have implemented GDPR-based frameworks and national criminal laws addressing non-consensual intimate imagery, and the UK's Online Safety Act imposes specific NCII obligations on platforms.

CONTRACT AND VENDOR IMPLICATIONS: Enterprise platforms built on Runway must implement content moderation and detection protocols sufficient to prevent NCII generation. Indemnification clauses in enterprise agreements should address allocation of liability for NCII-related claims arising from user-generated content on Runway-based platforms.

COMPLIANCE CONSIDERATIONS: Compliance teams should assess NCII detection and reporting capabilities in any Runway-based platform deployment. Legal teams should review obligations under the DEFIANCE Act and applicable state statutes for mandatory reporting or takedown procedures. Platforms serving EU users should assess GDPR obligations related to the biometric data processing implicated by AI-generated intimate imagery.
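The detection obligation described above can be illustrated with a minimal sketch of a pre-generation policy gate. Everything here is a hypothetical illustration, not Runway's actual API or detection pipeline: the request shape, the keyword screen, and the consent-registry lookup are all assumptions, and a production system would rely on trained content classifiers and identity matching rather than a keyword list.

```python
from dataclasses import dataclass, field

@dataclass
class GenerationRequest:
    # Hypothetical request shape; field names are illustrative only.
    user_id: str
    prompt: str
    # Identifiers of real people detected in reference media, if any.
    reference_people: list = field(default_factory=list)

# Illustrative keyword screen; real systems use trained classifiers.
INTIMATE_TERMS = {"nude", "undress", "explicit", "intimate"}

def consent_on_file(person_id: str) -> bool:
    """Placeholder consent-registry lookup; always False in this sketch."""
    return False

def allow_request(req: GenerationRequest) -> bool:
    """Deny generation when a prompt suggests intimate content and the
    request references a real person without recorded consent."""
    intimate = any(term in req.prompt.lower() for term in INTIMATE_TERMS)
    if intimate and req.reference_people:
        return all(consent_on_file(p) for p in req.reference_people)
    return True
```

In this sketch, a request combining an intimate prompt with a real person's likeness is blocked unless consent is recorded for every referenced individual; requests without either signal pass through.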


Applicable agencies

  • FTC — authority over deceptive and unfair practices in digital services; has engaged with non-consensual intimate imagery as a consumer-harm area.
  • State attorneys general — enforcement authority under state NCII statutes in approximately 48 US states.

Applicable regulations

  • CFAA (United States, federal)
  • Trump Executive Order on AI Policy Framework (US)

Provision details

Document information
  • Document: Runway Usage Policy
  • Entity: Runway
  • Document last updated: May 11, 2026

Tracking information
  • First tracked: May 11, 2026
  • Last verified: May 11, 2026
  • Record ID: CA-P-010704
  • Document ID: CA-D-00773
Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): d90a4f3400a54d7669e1b9b15a5d0ba7bd004f5b9d282b11d7d85314456abb41
  • Analysis generated: May 11, 2026 22:34 UTC
  • Evidence: snapshot stored; hash verified
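The recorded content hash lets anyone independently verify an archived snapshot against this record. A minimal sketch, assuming you have the captured document bytes saved locally (the file path is a placeholder, not part of the record):

```python
import hashlib

# SHA-256 digest recorded in the evidence-provenance record above.
EXPECTED_SHA256 = "d90a4f3400a54d7669e1b9b15a5d0ba7bd004f5b9d282b11d7d85314456abb41"

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """Return True if the file's SHA-256 digest matches the recorded hash."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large snapshots don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected

# Example (placeholder path): verify_snapshot("runway-usage-policy.html")
```

A match confirms the local copy is byte-identical to the snapshot the analysis was generated from; any edit to the file, however small, produces a different digest.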
Citation Record
Entity: Runway
Document: Runway Usage Policy
Record ID: CA-P-010704
Captured: 2026-05-11 22:34:16 UTC
SHA-256: d90a4f3400a54d76…
URL: https://conductatlas.com/platform/runway/runway-usage-policy/prohibition-on-non-consensual-intimate-imagery/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
  • Severity: High
  • Categories: (none listed)



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions


Is ConductAtlas affiliated with Runway?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Runway.