If you create a deepfake or AI-manipulated video using Pika, you are required to disclose to viewers that the content was artificially generated or manipulated.
This analysis describes what Pika's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This provision assigns users the obligation to disclose AI-generated or manipulated outputs. By incorporating applicable state and federal deepfake disclosure laws by reference, it places legal compliance responsibility on users rather than the platform.
Interpretive note: The specific form and context of disclosure required is not defined in the AUP itself and depends on applicable law, which varies by jurisdiction.
Users who generate AI video content through Pika, particularly deepfakes, must label or disclose that content as artificially generated when sharing it. Failure to do so may constitute an AUP violation and, potentially, a violation of applicable law.
"You agree to disclose your use of the Service and obtain consent to use the Service as required by, and in accordance with, Laws. You agree to disclose that any deepfake Output has been artificially generated or manipulated." — Excerpt from Pika's Acceptable Use Policy
REGULATORY LANDSCAPE: This provision incorporates by reference a growing body of state deepfake disclosure laws, including California AB 730, Texas SB 751, and Georgia SB 117, as well as proposed federal legislation. The FTC Act's prohibition on deceptive practices also applies. In the EU, the Digital Services Act and AI Act impose transparency requirements for AI-generated content.

GOVERNANCE EXPOSURE: Medium. The provision places disclosure obligations squarely on users, which aligns with the broader sole-responsibility framework of this AUP. How the obligation is operationalized, including what form of disclosure is required and in what context, is left to applicable law rather than defined by the AUP itself.

JURISDICTION FLAGS: California, Texas, Georgia, Virginia, and other states with deepfake disclosure laws create specific compliance obligations. EU users are subject to AI Act transparency requirements for AI-generated content. The breadth of applicable laws means the specific disclosure format required may vary significantly by jurisdiction.

CONTRACT AND VENDOR IMPLICATIONS: Enterprise users distributing AI-generated video content publicly should assess whether their distribution workflows incorporate adequate disclosure mechanisms to comply with jurisdiction-specific requirements.

COMPLIANCE CONSIDERATIONS: Compliance teams should map applicable deepfake disclosure laws by jurisdiction for any commercial use of Pika-generated outputs. Content distribution policies should include AI-generation disclosure requirements as a standard element.
ConductAtlas is an independent monitoring service. It is not affiliated with, endorsed by, or sponsored by Pika.