Google employees and contractors can read your Gemini conversations, including the prompts you send and the responses you receive, to improve the AI system.
Many users assume AI conversations are private and fully automated. This provision confirms that real people may read what you type to Gemini, including sensitive disclosures.
Human review of AI training data implicates GDPR Article 9 where special category data is incidentally submitted, and raises EU AI Act transparency obligations for general-purpose AI model training. Organisations deploying Gemini in regulated contexts (healthcare, legal, financial services) therefore face significant data-handling risk.
Your conversations with Gemini, including any personal or sensitive information you share, may be reviewed by Google employees and used to train AI models. Chat history is retained for up to 18 months by default, and Google may use this data across its broader ecosystem of services. To limit this, visit myactivity.google.com and disable or clear Gemini Apps Activity; avoid submitting confidential or sensitive information in prompts.