Perplexity can use anonymized data about how your employees use the service to improve its AI systems, as long as that data cannot be traced back to your company or any individual.
This analysis describes what Perplexity AI's agreement states, permits, or reserves; it is not a legal determination of enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
While the data is described as de-identified and aggregated, enterprise customers should understand that usage patterns from their organization may contribute to Perplexity's AI model training and product development. The practical effectiveness of de-identification depends on implementation, which the agreement does not detail.
Interpretive note: The exact language and scope of the de-identification provision was not directly extractable from the truncated HTML. The provision described reflects common structures in enterprise AI SaaS agreements; the actual terms may include more specific standards or opt-out mechanisms not captured here.
Enterprise employee usage data, once de-identified, may be used by Perplexity to improve its AI models under this provision. The agreement asserts this data cannot identify Customer or individual users, but the specific de-identification standards and technical measures are not described in the terms.
How other platforms handle this
We may de-identify, anonymize, or aggregate information we collect so the information cannot reasonably identify you or your device, or we may collect information that is already in de-identified form. For example, we may disclose performance benchmark data and other aggregated, anonymized, or de-id...
We use your personal information to personalize your experience with our products and services, improve and develop new features and products, conduct research and analytics, and to send you communications about products and services that may interest you.
We may use and share de-identified or aggregated information for any purpose, including research and analytics. We maintain and use de-identified data without attempting to re-identify it.
Monitoring
Perplexity AI has changed this document before.
"Perplexity may collect and use aggregated and de-identified data derived from Customer's and Authorized Users' use of the Service for purposes of improving, developing, and enhancing the Service and Perplexity's AI models, provided that such data does not identify Customer or any individual user." — Excerpt from Perplexity AI's Perplexity Enterprise Terms
(1) REGULATORY LANDSCAPE: This provision engages GDPR's requirements for lawful processing of personal data and its standards for effective anonymization, which GDPR guidance establishes must be irreversible. Under GDPR, data that is merely pseudonymized rather than truly anonymized remains subject to data protection obligations. CCPA's definition of de-identified data, and the associated obligations for businesses that use it, are also relevant. The EU AI Act may impose additional transparency requirements regarding training-data sourcing.

(2) GOVERNANCE EXPOSURE: Medium. The provision is common in SaaS AI agreements, but the absence of specific de-identification standards in the visible document text creates compliance uncertainty for enterprise customers subject to GDPR. If de-identification is not sufficiently robust, this provision could amount to processing personal data without an adequate legal basis.

(3) JURISDICTION FLAGS: EU and EEA enterprise customers face the highest exposure: GDPR's anonymization standards are more stringent than US standards, and the legal basis for processing employee data for AI model improvement is not straightforward under Article 6. California enterprise customers should evaluate whether the provision satisfies CCPA's de-identification requirements.

(4) CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers should request specification of the de-identification methodology in the data processing addendum or a separate technical annex. Procurement teams should confirm whether this use makes Perplexity a data controller for model-improvement purposes, which would affect the structure of the data processing agreement.

(5) COMPLIANCE CONSIDERATIONS: Data protection officers at enterprise customers should conduct a data protection impact assessment for this use case, particularly for EU deployments.
Internal data governance policies should inform employees that query data may be used in de-identified form for AI model improvement.
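Why implementation detail matters here: a common but weak approach to de-identification is to replace user identifiers with a hash. Because anyone who can enumerate candidate identifiers can rebuild the mapping, this is pseudonymization rather than anonymization under GDPR guidance, and the data remains personal data. The sketch below is purely illustrative (hypothetical identifiers; it implies nothing about Perplexity's actual mechanism):

```python
import hashlib

def pseudonymize(email: str) -> str:
    """Replace an identifier with its SHA-256 digest (a common but weak 'de-identification')."""
    return hashlib.sha256(email.encode()).hexdigest()

# A usage record stripped of the raw identifier...
record = {"user": pseudonymize("alice@example.com"), "queries": 42}

# ...is trivially re-identified by anyone who can enumerate candidate identifiers,
# e.g. from a company directory.
directory = ["alice@example.com", "bob@example.com"]
reverse = {pseudonymize(e): e for e in directory}
assert reverse[record["user"]] == "alice@example.com"
# Because re-identification is feasible, this counts as pseudonymization under GDPR,
# and the record remains subject to full data protection obligations.
```

This is why the absence of a stated methodology matters: "de-identified" in a contract can describe anything from reversible hashing to robust aggregation with noise, with very different regulatory consequences.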
Built from archived source documents, structured governance mappings, and historical version tracking.
Is ConductAtlas affiliated with Perplexity AI?
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Perplexity AI.