When you chat with Inflection AI's products such as Pi, what you say can be used to train and improve the AI, meaning your personal conversations become part of the data that shapes future model behavior.
Your conversation content, including any personal, health-related, emotional, or sensitive information you share with Inflection AI, may be used to train AI models. This means it is retained and processed beyond simply answering your immediate question.
People often share sensitive, emotional, or personal information with AI companions, and this clause means that data can be retained and used for commercial AI development, potentially without users fully understanding the implications.
REGULATORY FRAMEWORK: This provision implicates GDPR Arts. 5(1)(b) (purpose limitation), 6(1) (lawful basis for processing), 9 (special category data, where health, religious, or other sensitive information is inferred from conversation content), and 22 (automated decision-making). Under CCPA/CPRA §1798.140, using personal information to train AI models may constitute a "sale" or "sharing" depending on whether third-party model vendors receive the data. FTC Act Section 5 applies to any deceptive representations about how conversation data is used. Enforcement authorities include the FTC, EU DPAs, and the California Privacy Protection Agency (CPPA).