Runway prohibits any content involving child sexual abuse, reports any CSAM it becomes aware of to NCMEC (which works with global law enforcement agencies), and indefinitely suspends all associated accounts.
This analysis describes what Runway's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
This provision establishes Runway's most serious enforcement mechanism: an indefinite account suspension with no described appeal path. It also triggers mandatory law enforcement referral for CSAM violations.
Any account associated with CSAM content will be indefinitely suspended and reported to NCMEC and law enforcement; this is the only violation in the policy that specifies indefinite (rather than reviewable) suspension.
How other platforms handle this
Lime reserves the right to (a) modify or discontinue, temporarily or permanently, the Services (or any part thereof); (b) refuse any user access to the Services for any reason, including if Lime believes that user has violated this Agreement; at any time and without notice or liability to you or to ...
Twilio may, without notice, suspend or terminate Customer's account and access to the Services if Customer violates this Agreement, including the Acceptable Use Policy, or if Twilio reasonably believes that Customer's use of the Services is causing harm to Twilio, its network, or third parties.
After receiving and reviewing a report, our Team will take action on the Content where appropriate. These actions may include, but are not limited to: Asking the relevant User for collaboration or modifications to the Content; Unranking the Content; Adding a Not for All Audiences (NFAA) Tag; Removin...
Monitoring
Runway has changed this document before.
"Content that depicts, facilitates, or promotes child sexual abuse or the sexualization of children. We report any child sexual abuse material (CSAM) that we become aware of to the National Center for Missing and Exploited Children (NCMEC), which works with global law enforcement agencies around the world, and we indefinitely suspend all associated accounts."
— Excerpt from the Runway Usage Policy
(1) REGULATORY LANDSCAPE: This provision aligns with U.S. federal obligations under 18 U.S.C. § 2258A, which requires electronic service providers to report apparent CSAM to NCMEC. The FTC and DOJ have jurisdiction over platforms that fail to meet these obligations. In EU jurisdictions, the Digital Services Act and applicable child protection directives impose analogous mandatory reporting obligations.

(2) GOVERNANCE EXPOSURE: High. Mandatory CSAM reporting with indefinite account suspension is a legally required, high-stakes enforcement mechanism. Failure to implement effective detection and reporting processes creates substantial legal and reputational exposure. The policy's reference to both automated systems and human review is relevant to demonstrating a reasonable compliance posture.

(3) JURISDICTION FLAGS: Mandatory CSAM reporting obligations apply in the U.S. under federal law and in EU member states under national implementations of EU child protection directives. Multinational deployments should confirm that Runway's NCMEC reporting pipeline satisfies local legal requirements in all operating jurisdictions, as some may require additional or parallel reporting to domestic authorities.

(4) CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers should confirm in vendor agreements that Runway's detection and reporting processes meet applicable legal standards, and that the scope of "indefinite suspension" is clearly defined to avoid ambiguity in B2B service continuity provisions. The policy does not describe an appeal mechanism for CSAM-related suspensions, which may create due process considerations in jurisdictions with platform liability frameworks.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should confirm that Runway's automated and human review processes are documented and auditable, and that internal escalation paths for CSAM detection are clearly defined.
Enterprise agreements should address how CSAM-related account suspensions affecting shared organizational accounts will be handled operationally.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Runway.