You cannot use Runway's AI tools to help build surveillance systems that track people without their knowledge or consent, or that target protected groups.
This analysis describes what Runway's agreement states, permits, or reserves; it does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
This provision is operationally significant for enterprise users who might otherwise seek to integrate AI-generated visual or analytical content into surveillance, security, or monitoring systems, and engages multiple regulatory frameworks governing biometric and location data.
Interpretive note: The scope of 'mass surveillance' and 'unlawful monitoring' may require interpretation in specific enterprise deployment contexts, and the line between permitted security applications and prohibited surveillance systems may not be clearly defined by the policy text alone.
The terms prohibit using Runway's tools to build mass surveillance or non-consensual individual tracking systems, which establishes a boundary relevant to enterprise customers in security, law enforcement technology, and monitoring sectors.
How other platforms handle this
Do not generate images for political campaigns or to try to influence the outcome of an election. Do not generate images to spread misinformation or disinformation. Do not generate images to attempt to or to actually deceive or defraud anyone. Do not intentionally mislead recipients of generated ima...
Monitoring
Runway has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"You may not use Runway's tools to build or support systems designed for mass surveillance, tracking of individuals without their consent, or the unlawful monitoring of protected groups or activities." — Excerpt from Runway's Usage Policy
REGULATORY LANDSCAPE: This provision engages the EU AI Act, which prohibits real-time remote biometric identification systems in publicly accessible spaces and classifies certain biometric monitoring systems as high-risk. The GDPR imposes strict requirements on biometric data processing. In the US, the FTC Act and state biometric privacy laws (Illinois BIPA, Texas CUBI, Washington's My Health My Data Act) are engaged. The provision's reference to 'protected groups' also engages Title VII and equivalent anti-discrimination frameworks.

GOVERNANCE EXPOSURE: High for enterprise users in the security, law enforcement technology, and monitoring sectors. The EU AI Act's prohibition on real-time biometric identification and mass surveillance applications creates direct regulatory conflict for any enterprise attempting to use Runway in such systems, and Illinois BIPA's consent requirements create significant litigation risk for biometric data processing without consent.

JURISDICTION FLAGS: EU (EU AI Act prohibitions on biometric surveillance are directly applicable); Illinois (BIPA's private right of action creates heightened litigation exposure); Texas and Washington (state biometric privacy statutes). Law enforcement technology deployments in the US also require evaluation against Fourth Amendment constraints and emerging federal AI policing guidance.

CONTRACT AND VENDOR IMPLICATIONS: Enterprise procurement teams in government, law enforcement technology, and security sectors must assess whether their intended use of Runway engages this prohibition. B2B contracts should include representations about intended use cases to avoid downstream liability for surveillance-related applications.

COMPLIANCE CONSIDERATIONS: Legal teams at security technology companies should assess whether integrating Runway's tools into existing monitoring or analytics platforms triggers this prohibition.
EU-based enterprises should assess EU AI Act compliance for any biometric monitoring or surveillance-adjacent applications. Data protection impact assessments (DPIAs) under the GDPR should be conducted for any application involving biometric or location data processing.
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Free: track 1 platform + weekly digest. Watcher: 10 platforms + same-day alerts. No credit card required.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Runway.