Perplexity AI · Perplexity Enterprise Terms

De-Identified Usage Data for Model Improvement

Medium severity · Low confidence · Inferred from context · Unique · 0 of 325 platforms
Document Record

What it is

Perplexity can use anonymized data about how your employees use the service to improve its AI systems, as long as that data cannot be traced back to your company or any individual.

This analysis describes what Perplexity AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

While the data is described as de-identified and aggregated, enterprise customers should understand that usage patterns from their organization may contribute to Perplexity's AI model training and product development. The practical effectiveness of de-identification depends on implementation, which the agreement does not detail.

Interpretive note: The exact language and scope of the de-identification provision were not directly extractable from the truncated HTML. The provision described reflects common structures in enterprise AI SaaS agreements; the actual terms may include more specific standards or opt-out mechanisms not captured here.

Consumer impact (what this means for users)

Enterprise employee usage data, once de-identified, may be used by Perplexity to improve its AI models under this provision. The agreement asserts this data cannot identify Customer or individual users, but the specific de-identification standards and technical measures are not described in the terms.
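The distinction the analysis keeps returning to — whether de-identification is robust enough — can be illustrated with a minimal sketch. Salted hashing of user identifiers keeps records linkable (pseudonymization, which remains personal data under GDPR), while aggregation with a suppression threshold discards small groups outright. The function names, salt value, and threshold `k` below are illustrative assumptions, not anything specified in Perplexity's terms.

```python
import hashlib
from collections import Counter

def pseudonymize(user_id: str, salt: str = "org-salt") -> str:
    # A salted hash replaces the raw identifier but is stable per user,
    # so records remain linkable; under GDPR this is pseudonymization,
    # not anonymization, and the output is still personal data.
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]

def aggregate_with_suppression(events: list[tuple[str, str]], k: int = 5) -> dict[str, int]:
    # Aggregate (user, topic) events into per-topic counts and suppress
    # any group smaller than k -- a minimal rule for reducing the chance
    # that a published count can be traced back to an individual.
    counts = Counter(topic for _user, topic in events)
    return {topic: n for topic, n in counts.items() if n >= k}
```

Hashing alone preserves per-user linkability; aggregation with suppression removes small groups entirely, which is closer to what regulators treat as effective anonymization. An agreement that names only "de-identified data" does not say which of these (or what else) is actually applied.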

How other platforms handle this

Groq Medium

We may de-identify, anonymize, or aggregate information we collect so the information cannot reasonably identify you or your device, or we may collect information that is already in de-identified form. For example, we may disclose performance benchmark data and other aggregated, anonymized, or de-id...

TurboTax Medium

We use your personal information to personalize your experience with our products and services, improve and develop new features and products, conduct research and analytics, and to send you communications about products and services that may interest you.

Walgreens Medium

We may use and share de-identified or aggregated information for any purpose, including research and analytics. We maintain and use de-identified data without attempting to re-identify it.


Monitoring

Perplexity AI has changed this document before.

Original Clause Language
Perplexity may collect and use aggregated and de-identified data derived from Customer's and Authorized Users' use of the Service for purposes of improving, developing, and enhancing the Service and Perplexity's AI models, provided that such data does not identify Customer or any individual user.

— Excerpt from Perplexity AI's Perplexity Enterprise Terms


Institutional analysis (Compliance & governance intelligence)

1. Regulatory landscape: This provision engages GDPR's requirements for lawful processing of personal data and the standards for effective anonymization, which GDPR guidance establishes must be irreversible. Under GDPR, data that is merely pseudonymized rather than truly anonymized remains subject to data protection obligations. CCPA's definition of de-identified data, and the associated obligations for businesses using de-identified data, are also relevant. The EU AI Act may impose additional transparency requirements regarding training data sourcing.

2. Governance exposure: Medium. The provision is common in SaaS AI agreements, but the absence of specific de-identification standards in the visible document text creates compliance uncertainty for enterprise customers subject to GDPR. If de-identification is not sufficiently robust, this provision could constitute processing of personal data without an adequate legal basis.

3. Jurisdiction flags: EU and EEA enterprise customers face the highest exposure: GDPR's anonymization standards are more stringent than US standards, and the legal basis for processing employee data for AI model improvement is not straightforward under Article 6. California enterprise customers should evaluate whether the provision satisfies CCPA's de-identification requirements.

4. Contract and vendor implications: Enterprise customers should request specification of the de-identification methodology in the data processing addendum or a separate technical annex. Procurement teams should confirm whether this usage makes Perplexity a data controller for model-improvement purposes, which would affect the structure of the data processing agreement.

5. Compliance considerations: Data protection officers at enterprise customers should conduct a data protection impact assessment for this use case, particularly for EU deployments. Internal data governance policies should inform employees that query data may be used in de-identified form for AI model improvement.


Applicable agencies

  • FTC
    The FTC has authority over data practices and may scrutinize whether de-identification claims in AI training-data contexts meet adequate standards under consumer protection frameworks.

Applicable regulations

  • EU AI Act (European Union)
  • CCPA/CPRA (California, USA)
  • Colorado AI Act (Colorado, USA)
  • CAN-SPAM (United States, federal)
  • ePrivacy Directive (European Union)
  • EU AI Act - High Risk Provisions (European Union)
  • FTC Act Section 5 (United States, federal)
  • GDPR (European Union)
  • UK GDPR (United Kingdom)

Provision details

Document information
  • Document: Perplexity Enterprise Terms
  • Entity: Perplexity AI
  • Document last updated: May 11, 2026

Tracking information
  • First tracked: May 11, 2026
  • Last verified: May 11, 2026
  • Record ID: CA-P-010726
  • Document ID: CA-D-00762

Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): 83fa458f3392c69658ad47cc4300f2a58755f20eb0acaf2a3490f9ce3bb6aab6
  • Analysis generated: May 11, 2026 13:26 UTC
  • Evidence: snapshot stored, hash verified
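The published SHA-256 content hash lets anyone verify the archived snapshot independently: recompute the digest of the stored bytes and compare it to the recorded value. A minimal sketch (the function name is an illustrative assumption, not a ConductAtlas API):

```python
import hashlib

def verify_snapshot(snapshot: bytes, expected_sha256: str) -> bool:
    # Recompute the SHA-256 digest of the archived snapshot and compare
    # it to the published hash; a match confirms the evidence is byte-for-byte
    # identical to what was originally captured.
    return hashlib.sha256(snapshot).hexdigest() == expected_sha256.lower()
```

Because SHA-256 is collision-resistant in practice, a matching digest is strong evidence that the snapshot has not been altered since capture.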
Citation Record
Entity: Perplexity AI
Document: Perplexity Enterprise Terms
Record ID: CA-P-010726
Captured: 2026-05-11 13:26:14 UTC
SHA-256: 83fa458f3392c696…
URL: https://conductatlas.com/platform/perplexity-ai/perplexity-enterprise-terms/de-identified-usage-data-for-model-improvement/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
  • Severity: Medium



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Perplexity AI's De-Identified Usage Data for Model Improvement clause do?

The clause permits Perplexity to use aggregated, de-identified data derived from Customer's and Authorized Users' use of the Service to improve the Service and Perplexity's AI models. While the data is described as de-identified and aggregated, enterprise customers should understand that usage patterns from their organization may contribute to Perplexity's AI model training and product development. The practical effectiveness of de-identification depends on implementation, which the agreement does not detail.

How does this clause affect you?

Enterprise employee usage data, once de-identified, may be used by Perplexity to improve its AI models under this provision. The agreement asserts this data cannot identify Customer or individual users, but the specific de-identification standards and technical measures are not described in the terms.

Is ConductAtlas affiliated with Perplexity AI?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Perplexity AI.