The questions and conversations you have with Perplexity AI may be used to train and improve the AI systems that power the service.
This analysis describes what Perplexity AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
Search queries often contain sensitive personal, professional, or financial information, and users may not expect that a search-style interaction is contributing to AI model development.
Interpretive note: The policy does not specify whether training applies to raw identifiable query data or anonymized/aggregated data, which affects the practical privacy risk and the applicable regulatory obligations.
Your search queries and AI conversation history may be used as training data for Perplexity's models, which means personal details you share while searching could inform how the AI system behaves for other users or future versions of the product.
How other platforms handle this
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
When you use AI features of the Services, you acknowledge that your inputs may be processed by third-party AI providers. ClickUp may use anonymized and aggregated data derived from your use of the Services to improve and train AI models and features.
We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...
Monitoring
Perplexity AI has changed this document before.
"We may use the information we collect, including the content of your searches and interactions with our AI, to train, fine-tune, and improve our models and services.— Excerpt from Perplexity AI's Perplexity AI Privacy Policy
(1) REGULATORY LANDSCAPE: This provision engages GDPR Article 6 (lawful basis for processing), Article 9 (special category data, if queries contain health or other sensitive information), and potentially Article 22 (automated decision-making), enforced by EU data protection authorities (DPAs). Under CCPA/CPRA, use of personal information for AI training may constitute a secondary use requiring updated disclosure and, depending on implementation, may engage sensitive personal information handling rules. The FTC has signaled scrutiny of AI training data practices under its unfair or deceptive practices authority.

(2) GOVERNANCE EXPOSURE: High. The provision is broad and does not specify whether training applies to raw, pseudonymized, or anonymized query data, creating ambiguity about the actual privacy risk to individual users. Given that users may enter names, locations, medical questions, legal matters, or financial details into the search interface, the potential for sensitive data to enter training pipelines is material.

(3) JURISDICTION FLAGS: EU/EEA users face heightened exposure because GDPR requires a clearly identified lawful basis for AI training use of personal data, and legitimate interest may be challenged where users have not been given meaningful notice or control. California users should evaluate whether this use constitutes sharing personal information for purposes beyond the primary service, potentially triggering CPRA opt-out rights.

(4) CONTRACT AND VENDOR IMPLICATIONS: Enterprise customers deploying Perplexity AI should request a Data Processing Agreement that explicitly addresses whether employee query data is excluded from AI training pipelines or whether opt-out mechanisms are available at the organizational level. The absence of explicit training data carve-outs in the standard policy is a due diligence flag for B2B procurement teams.
(5) COMPLIANCE CONSIDERATIONS: Compliance teams should assess whether a DPIA is required under GDPR for this data use, update internal data handling policies to reflect that employee queries to Perplexity may be used for AI training, and evaluate whether the current consent or notice mechanism provided to users is sufficient to support the training use as a lawful basis.
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Perplexity AI.