Microsoft · Microsoft Privacy Statement (Legacy)

AI and Copilot Data Use for Product Improvement

High severity

What it is

When you use Microsoft's AI tools like Copilot, what you type or say (your prompts) and the AI's responses may be collected and used by Microsoft to improve its AI systems, with an opt-out available in some products.

Consumer impact (what this means for users)

Consumers who use Copilot or other Microsoft AI features should know that their chat prompts and AI responses may be stored and used to train Microsoft AI models unless they actively opt out through product-specific settings.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Opt Out of AI Model Training
    Go to account.microsoft.com/privacy, navigate to Privacy Settings, and review the AI/Copilot-specific controls. In applicable products (e.g., Copilot in Windows or Microsoft 365), locate the setting that allows your data to be used for AI model training and turn it off.

Cross-platform context

See how other platforms handle AI and Copilot Data Use for Product Improvement and similar clauses.


Why it matters (compliance & risk perspective)

Your AI conversations with Microsoft tools may contain sensitive personal, professional, or confidential information, and this data could be reviewed by Microsoft employees or used to train AI models.

View original clause language
When you use AI features, we collect your prompts and the AI-generated results, as well as related product usage data. We use this data to operate and improve AI features. Depending on the product, Microsoft may also use this data to train and improve AI models. We will tell you in the product if Microsoft uses your data for AI model training and give you controls to turn off this use. You can also review our product-specific privacy details below for more information.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: This provision implicates GDPR Arts. 5(1)(b) (purpose limitation), 5(1)(c) (data minimisation), 6(1)(a/f) (lawful basis — consent or legitimate interests), 13/14 (transparency), and 22 (automated decision-making); EU AI Act Art. 10 (data governance for high-risk AI systems) and Art. 13 (transparency obligations); CCPA/CPRA §1798.100 and §1798.120 (right to opt out of the sale or sharing of personal information); and FTC Act Section 5 (unfair or deceptive practices). Enforcement authorities include EU supervisory authorities (lead DPC Ireland), ICO (UK), California CPPA, and FTC.


Applicable agencies

  • FTC
    FTC has jurisdiction over unfair or deceptive data practices involving AI training data collection and use under FTC Act Section 5.
    File a complaint →

Provision details

Document information
Document
Microsoft Privacy Statement (Legacy)
Entity
Microsoft
Document last updated
March 5, 2026
Tracking information
First tracked
April 28, 2026
Last verified
April 28, 2026
Record ID
CA-P-003850
Document ID
CA-D-00001
Evidence Provenance
Source URL
Wayback Machine
SHA-256
9e697464d17b7148c787f07099c60e30370abb2b13a7f2a910f607e31ec13158
Verified
✓ Snapshot stored   ✓ Change verified
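The provenance record above pairs the archived snapshot with a SHA-256 digest, so anyone holding a copy of the document can confirm it matches what was captured. A minimal sketch of that check in Python follows; the file path and helper names are illustrative, not part of the archive's tooling.

```python
import hashlib

# SHA-256 published in the provenance record above
EXPECTED_SHA256 = "9e697464d17b7148c787f07099c60e30370abb2b13a7f2a910f607e31ec13158"

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Hash a file in fixed-size chunks so large snapshots need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """True if the local copy's digest matches the published hash."""
    return sha256_of_file(path) == expected.lower()
```

Note that the hash covers the exact captured bytes; re-downloading the live page will generally produce a different digest, so the comparison only holds against the stored snapshot itself.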
How to Cite
ConductAtlas Policy Archive
Entity: Microsoft | Document: Microsoft Privacy Statement (Legacy) | Record: CA-P-003850
Captured: 2026-04-28 08:11:57 UTC | SHA-256: 9e697464d17b7148…
URL: https://conductatlas.com/platform/microsoft/microsoft-privacy-statement-legacy/ai-and-copilot-data-use-for-product-improvement/
Accessed: April 29, 2026
Classification
Severity
High
Categories
