Unless you turn off Gemini Apps Activity in your account settings, Google uses your conversations to train and improve its AI systems.
Change note: this version reverses the previous language. The clause now states that turning off Gemini Apps Activity does prevent conversations from being used to train AI models, where the prior version said opting out "doesn't prevent" such use.
This provision means that conversations you have with Gemini, including any personal, medical, financial, or sensitive information, may be used to train Google's AI models. While you can opt out via account settings, conversations already used for training cannot be retroactively removed from model weights.
Your conversations — including sensitive questions or personal disclosures — may permanently influence Google's AI model weights, meaning the substance of your interactions could persist in the model even after your conversation data is deleted.
Regulatory framework: This clause implicates GDPR Art. 6(4) (compatibility test for secondary processing), Art. 9 (explicit consent required for training on special category data), Art. 22 (automated decision-making with significant effects), and Recital 47 (legitimate-interests balancing). EU AI Act Title IV transparency obligations apply to general-purpose AI models. FTC Act Section 5 applies to undisclosed secondary uses of consumer data. CCPA §1798.120 (right to opt out of sale or sharing) may apply if model training constitutes data sharing.