This is Microsoft's master privacy policy, covering every Microsoft product you use, from Windows and Xbox to Bing, Outlook, Teams, and Copilot AI, and explaining what personal data Microsoft collects and how it uses it. The most important thing to know is that Microsoft collects extensive data, including your search queries, voice commands, location, browsing history, and content you create, and may use this data to personalise advertising and improve AI products like Copilot. You can review and delete much of your personal data, adjust diagnostic data settings in Windows, and manage targeted advertising preferences through the Microsoft Privacy Dashboard at account.microsoft.com/privacy.
Technical Summary
This document is Microsoft's global Privacy Statement, governing the collection, use, storage, and sharing of personal data across all Microsoft products and services, including Windows, Microsoft 365, Azure, Xbox, Bing, Cortana, and Copilot AI, with legal basis variously grounded in consent, contractual necessity, legitimate interests, and legal obligation under applicable law. The most significant obligations include Microsoft's commitment to provide users with data access, portability, correction, deletion, and objection rights, while simultaneously reserving broad rights to process personal data for product improvement, personalization, advertising, and AI model development across its entire product ecosystem. Notably, Microsoft retains voice and typed data from products like Cortana and search queries from Bing for purposes including AI training, and collects diagnostic data from Windows devices at varying levels (Required vs. Optional), creating a non-standard aggregation risk across an exceptionally wide product surface area. The statement engages the GDPR (Articles 6, 13, 14, 17, 20), the CCPA/CPRA (§1798.100 et seq.), COPPA (children's data provisions for family accounts and Xbox), HIPAA (where health data is involved through MSN Health or similar), and Washington State's My Health My Data Act; material compliance considerations include the breadth of cross-service data linkage, the retention of AI interaction data, and the dual use of diagnostic telemetry for both security and product improvement purposes. Compliance teams should note that Microsoft's data transfers outside the EEA rely on Standard Contractual Clauses and adequacy decisions, and that the statement's AI-specific data practices, particularly around Copilot, may engage the EU AI Act's transparency and data governance obligations.
Institutional Analysis
REGULATORY EXPOSURE: This statement engages GDPR Articles 6 (lawful basis), 13/14 (transparency), 17 (erasure), 20 (portability), and 22 (automated decision-making), enforceable by EU Data Protection Authorities (lead authority: Irish DPC for EU operations); CCPA/CPRA §§1798.100–1798.199 enforceabl…
Change Timeline
April 1, 2026: Document updated (severity: medium)
Microsoft updated how it explains data retention in its privacy statement on April 1, 2026. Previously, the policy listed specific criteria — like whether users expected data to persist or whether sensitive data types warranted shorter retention — in a more detailed, consumer-facing format. The updated version simplifies and reorganizes this section, replacing granular examples and criteria with broader categories, which may make it harder for users to understand exactly how long their data is kept.
Microsoft updated its Privacy Statement on March 13, 2026, adding new language that allows the company to contact users who consent to marketing communications at phone numbers they provide, using auto-dialers and AI-generated or prerecorded voice messages. Microsoft also removed a sentence that had granted additional rights to users in the European Economic Area under the updated policy. The statement's effective date moved from February 2026 to March 2026, and a reference to a data retention policy update tied to new regulatory requirements was removed.
Microsoft updated its Privacy Statement on March 5, 2026, to reflect new data retention rules tied to recent regulatory requirements. The update also specifically grants users in the European Economic Area (EEA) additional rights under the revised policy. This matters because it may change how long Microsoft keeps your data and expands certain legal protections for EEA residents.
Microsoft made a minor structural update to their privacy statement on March 5, 2026. The document itself did not change in terms of new rights or policies — only its organization or formatting was adjusted. This type of change typically has no direct impact on how your personal data is collected or used.
What changed
Microsoft updated its Microsoft Privacy Statement (Legacy) on April 1, 2026. Change detected: 1 sentence added, 11 sentences removed, 9 sentences modified. The document contained 2,296 sentences after the update.
Consumer impact
Microsoft changed the section of its privacy policy explaining how long it keeps your personal data, replacing specific examples and criteria with broader, more general language. The updated policy removes details like the 30-day grace period after emptying your Outlook Deleted Items folder and the explicit mention of sensitive data types warranting shorter retention, making it harder to know exactly how your data is handled. You can review your data and manage retention settings directly through the Microsoft Privacy Dashboard at account.microsoft.com/privacy.
Why it matters
Microsoft removed specific details about how long it keeps your data — including a concrete 30-day window after deleting emails and explicit protections for sensitive data — replacing them with vague, general language. This makes it harder for users and regulators to hold Microsoft accountable to specific retention commitments.
What changed
Microsoft updated its Privacy Statement on March 13, 2026. Change detected: 1 sentence added, 2 sentences removed, 1 sentence modified. The document contained 2,306 sentences after the update.
Consumer impact
Microsoft has added language permitting it to use auto-dialers and AI-generated or prerecorded voice calls to reach users who consent to marketing communications at a phone number they provide. Simultaneously, the policy removed a sentence that had promised additional rights to users in the European Economic Area, which may reduce protections for those users. You can avoid these marketing calls by not providing Microsoft with your phone number or by declining to consent to marketing communications when prompted.
Why it matters
The new AI auto-dialer marketing language means Microsoft can call users with AI-generated voices if they provide a phone number and consent, which expands corporate contact rights significantly. The simultaneous removal of EEA rights language reduces previously stated protections for European users without explanation.
What changed
Microsoft updated its Privacy Statement on March 5, 2026. Change detected: 2 sentences added, 1 sentence modified. The document contained 2,307 sentences after the update.
Consumer impact
Microsoft has updated its data retention policy to comply with new regulatory requirements effective March 2026, which may change how long your personal data is stored. Users in the European Economic Area now have additional rights under this updated policy, potentially including stronger controls over how their data is used or deleted. You can review Microsoft's updated Privacy Statement to understand what new rights apply to you as an EEA resident.
Why it matters
EEA users now hold additional rights that could affect how long Microsoft retains their data and what controls they can exercise over it. This change is directly tied to new regulatory requirements, making it legally significant for affected users.
What changed
Microsoft updated its Privacy Statement on March 5, 2026. Change detected: minor structural change. The document contained 2,305 sentences after the update.
Consumer impact
Microsoft made a minor structural change to their privacy statement on March 5, 2026, with no apparent shift in how personal data is collected, used, or shared. The document's content appears largely unchanged in substance, meaning your existing privacy rights and data practices remain the same. No immediate action is required on your part.
Why it matters
While this appears to be a minor structural change with no immediate consumer impact, tracking even small updates to privacy statements helps users and compliance teams stay informed about evolving data practices.
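The "Change detected" counts reported above (sentences added, removed, modified) imply a sentence-level diff between document versions. A minimal sketch of how such counts might be computed, using naive sentence splitting and Python's standard difflib; this is an illustration under stated assumptions, not the tracker's actual method:

```python
import difflib
import re

def split_sentences(text):
    """Naive splitter: break on terminal punctuation followed by whitespace."""
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

def sentence_diff(old_text, new_text):
    """Count sentences added, removed, and modified between two versions."""
    old, new = split_sentences(old_text), split_sentences(new_text)
    added = removed = modified = 0
    for op, i1, i2, j1, j2 in difflib.SequenceMatcher(a=old, b=new).get_opcodes():
        if op == "insert":
            added += j2 - j1
        elif op == "delete":
            removed += i2 - i1
        elif op == "replace":
            # Pair replaced sentences off as "modified"; any surplus on one
            # side counts as pure additions or removals.
            paired = min(i2 - i1, j2 - j1)
            modified += paired
            added += (j2 - j1) - paired
            removed += (i2 - i1) - paired
    return {"added": added, "removed": removed, "modified": modified,
            "total_sentences": len(new)}

old = "We keep data. You may object."
new = "We keep data. You may object and appeal. New AI terms apply."
print(sentence_diff(old, new))
```

Treating a replace block as pairwise "modifications" is one plausible convention; a real tracker might instead use similarity thresholds to decide whether a replaced sentence is a modification or an unrelated add/remove pair.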
Microsoft can combine data it collects from different products you use — like Bing searches, Cortana voice queries, and Copilot conversations — to improve its AI systems and personalise your experience across its entire product range.
Children under 13 need a parent or guardian to give permission before Microsoft collects their data; Microsoft also provides parental controls through Microsoft Family Safety to manage what data is collected about children.
When you use voice commands or type queries to Microsoft's AI products like Copilot, Microsoft collects and retains those inputs and uses them to improve its AI models, not just to respond to your immediate request.
Microsoft collects health-related data through products like Microsoft Health, fitness tracking, and health-related search queries, and processes this data under both this Privacy Statement and a separate Consumer Health Data Privacy Policy.
Windows automatically collects diagnostic data about your device and how you use it — at minimum a 'Required' level for security, and optionally a broader 'Optional' level that includes detailed usage patterns used to improve and personalise Windows.
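For administrators who want to check which diagnostic data level a machine is pinned to, a hypothetical sketch is below. It reads the machine-wide group-policy value (the AllowTelemetry value under HKLM\SOFTWARE\Policies\Microsoft\Windows\DataCollection, which Microsoft documents for this policy); the Settings app stores a separate per-user preference, so a missing policy value does not mean diagnostic data is off. Windows-only; the function returns None elsewhere.

```python
# Documented policy values: 0 = diagnostic data off (Enterprise editions only),
# 1 = Required, 3 = Optional. Other values may appear on older Windows builds.
LEVELS = {0: "Security (Enterprise editions only)", 1: "Required", 3: "Optional"}

def diagnostic_data_policy_level():
    """Return the machine-wide diagnostic data policy level, or None if unset
    or not running on Windows."""
    try:
        import winreg  # stdlib, available only on Windows
    except ImportError:
        return None
    path = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
            value, _ = winreg.QueryValueEx(key, "AllowTelemetry")
    except FileNotFoundError:
        return None  # no group policy set; the per-user Settings value applies
    return LEVELS.get(value, f"unrecognised level {value}")

print(diagnostic_data_policy_level() or "no machine-wide policy set")
```

Note that this inspects policy state only; end users without such a policy manage the same choice through Settings > Privacy & security > Diagnostics & feedback.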
Microsoft shares your search queries, IP address, location, and device identifiers with advertising partners to deliver targeted ads, though it does not share your name or email address directly with those advertisers.
You have the right to see what personal data Microsoft holds about you, correct inaccuracies, download a copy of your data, and in some circumstances ask Microsoft to delete it — all manageable through the Microsoft Privacy Dashboard.
Microsoft transfers personal data from the EU and EEA to other countries, including the US, using Standard Contractual Clauses — a legal mechanism approved by the European Commission to make such transfers lawful.