When you use Microsoft AI tools such as Copilot, your prompts (what you type or say) and the AI's responses may be collected by Microsoft and used to train and improve its AI models. This applies unless you actively opt out, and the opt-out is available only in some products, through product-specific settings.
Your AI conversations with Microsoft tools may contain sensitive personal, professional, or confidential information, and this data could be reviewed by Microsoft employees or used to train AI models.
REGULATORY FRAMEWORK: This provision implicates GDPR Arts. 5(1)(b) (purpose limitation), 5(1)(c) (data minimisation), 6(1)(a) and 6(1)(f) (lawful basis: consent or legitimate interests), 13/14 (transparency), and 22 (automated decision-making); EU AI Act Art. 10 (data governance for high-risk AI systems) and Art. 13 (transparency obligations); CCPA/CPRA §1798.100 (general duties), §1798.120 (right to opt out of sale or sharing, including sharing for cross-context behavioral advertising), and §1798.121 (right to limit use and disclosure of sensitive personal information); and FTC Act Section 5 (unfair or deceptive practices). Enforcement authorities include EU supervisory authorities (with Ireland's DPC as lead), the UK ICO, the California CPPA, and the FTC.