This analysis describes what Perplexity AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
Search queries often contain sensitive personal information about health, finances, relationships, or legal issues — using these as AI training data without explicit opt-in consent creates real privacy risks.
Using Perplexity AI means your queries, conversation history, device data, and usage patterns are collected and may be shared with third-party service providers, advertising partners, and potentially acquirers in a corporate transaction. The policy permits use of interaction data for AI model training and improvement, which means sensitive questions submitted to the search engine may contribute to training datasets. You can exercise data access, deletion, or opt-out rights by submitting a request to privacy@perplexity.ai.
How other platforms handle this
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...
Users under 18 years old interact with an age-appropriate model specifically designed to reduce the likelihood of exposure to sensitive or suggestive content. Our under-18 model has additional and more conservative classifiers than the model for our adult users so we can enforce our content policies...
Monitoring
Perplexity AI has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"We may use the information we collect, including the queries you submit and the content you interact with, to train, improve, and develop our artificial intelligence models and services. This includes using your search queries, feedback, and interactions to refine our models' performance and accuracy." — Excerpt from Perplexity AI's Privacy Policy
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Perplexity AI.