Perplexity AI · Perplexity Acceptable Use Policy

Prohibition on Child Sexual Abuse Material (CSAM)

High severity · High confidence · Explicit document language · Rare: 1 of 325 platforms
Document Record

What it is

Generating, sharing, or enabling content that sexually exploits children is strictly prohibited and likely illegal under applicable law.

This analysis describes what Perplexity AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This is an absolute prohibition aligned with US and international law; violations may expose users to criminal liability beyond account termination.

Consumer impact (what this means for users)

Any user who generates or facilitates CSAM through the platform violates this policy and faces account action, as well as potential criminal prosecution under applicable federal and international law.

How other platforms handle this

Amazon · Medium severity

You may not use the Services to: violate the security or integrity of any network, computer or communications system, software application, or network or computing device; access or use any system without permission, including attempting to probe, scan, or test the vulnerability of a system or to br...

Runway · Medium severity

You may not use Runway's tools to create content that promotes, glorifies, or facilitates acts of terrorism, mass violence, or genocide, or that could be used to provide material support to individuals or organizations engaged in such activities.

Mistral AI · Medium severity

Customer will not, and will not permit any other person (including any End User) to: ... (d) attempt to reverse engineer, decompile, or otherwise attempt to discover the source code or underlying components (e.g., algorithms, weights, or systems) of the Mistral AI Products, including using the Outpu...


Monitoring

Perplexity AI has changed this document before.

Original Clause Language (Document Record)
You may not use the Services to generate, distribute, or facilitate child sexual abuse material (CSAM) or any content that sexually exploits or harms minors.

— Excerpt from Perplexity AI's Perplexity Acceptable Use Policy

ConductAtlas Analysis

Institutional analysis (Compliance & governance intelligence)

Regulatory landscape: This provision directly implicates US federal law, including the PROTECT Act and 18 U.S.C. 2256 et seq., enforced by the DOJ, with mandatory reporting obligations to NCMEC under federal law. Internationally, similar prohibitions are enforced under national criminal codes and EU Directive 2011/93/EU. The FTC does not have primary jurisdiction here; criminal enforcement applies.

Governance exposure: High. Platforms that generate AI content must implement technical safeguards to prevent CSAM generation. Failure to do so creates criminal and civil liability exposure for the platform, not just the user. Compliance teams should ensure content moderation systems are audited against this requirement.

Jurisdiction flags: This prohibition applies globally and is consistent with law in virtually all jurisdictions. However, mandatory reporting obligations vary; US-based platforms have specific NCMEC reporting requirements under federal law.

Contract and vendor implications: Enterprise customers deploying Perplexity in environments accessible to minors should assess whether additional safeguards are in place beyond the AUP prohibition, and whether contractual representations from Perplexity regarding content moderation are adequate.

Compliance considerations: Compliance teams should verify that Perplexity's technical implementation includes classifiers or filters sufficient to prevent CSAM generation, and should review Perplexity's incident response procedures for NCMEC reporting compliance.


Applicable agencies

  • FTC
    The FTC has jurisdiction over platform practices related to child safety online and may take action under COPPA or unfair practices authority if platform safeguards are inadequate.

Applicable regulations

  • CFAA — United States, Federal
  • DMCA — United States, Federal
  • DSA — European Union
  • Trump Executive Order on AI Policy Framework — United States

Provision details

Document information
Document
Perplexity Acceptable Use Policy
Entity
Perplexity AI
Document last updated
May 11, 2026
Tracking information
First tracked
May 11, 2026
Last verified
May 11, 2026
Record ID
CA-P-010544
Document ID
CA-D-00760
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
6d664bd3ce2e23b73f26f6644d636b1fb81e00cce440e455edc0bbedcc549ceb
Analysis generated
May 11, 2026 11:44 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
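The record above pairs an archived snapshot with a SHA-256 content hash, so anyone holding a copy of the snapshot can independently confirm it matches what was analyzed. A minimal sketch of that verification in Python is shown below; the filename and the workflow around it are illustrative assumptions, not part of ConductAtlas's tooling, and only the standard-library `hashlib` is used.

```python
import hashlib

def verify_snapshot(snapshot_bytes: bytes, recorded_hex: str) -> bool:
    """Return True if the snapshot's SHA-256 digest matches the recorded hash.

    recorded_hex is the hex digest published in the evidence record;
    comparison is case-insensitive on the hex string.
    """
    actual = hashlib.sha256(snapshot_bytes).hexdigest()
    return actual == recorded_hex.strip().lower()

# Hypothetical usage with a locally stored copy of the archived page:
# with open("perplexity-aup-snapshot.html", "rb") as f:
#     matches = verify_snapshot(f.read(), "6d664bd3ce2e23b73f26f6644d636b1f"
#                                         "b81e00cce440e455edc0bbedcc549ceb")
```

If the digest differs, the local copy is not byte-identical to the archived source and should not be cited against this record.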
Citation Record
Entity: Perplexity AI
Document: Perplexity Acceptable Use Policy
Record ID: CA-P-010544
Captured: 2026-05-11 11:44:15 UTC
SHA-256: 6d664bd3ce2e23b7…
URL: https://conductatlas.com/platform/perplexity-ai/perplexity-acceptable-use-policy/prohibition-on-child-sexual-abuse-material-csam/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity
High


Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Perplexity AI's Prohibition on Child Sexual Abuse Material (CSAM) clause do?

This is an absolute prohibition aligned with US and international law; violations may expose users to criminal liability beyond account termination.

How does this clause affect you?

Any user who generates or facilitates CSAM through the platform violates this policy and faces account action, as well as potential criminal prosecution under applicable federal and international law.

How many platforms have this type of clause?

ConductAtlas has identified this type of provision on 1 of the 325 platforms it tracks. See the full comparison.

Is ConductAtlas affiliated with Perplexity AI?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Perplexity AI.