When you use Microsoft's AI services such as Copilot, your prompts, queries, and interactions may be collected and used to improve AI models and personalise your experience.
AI interactions can contain highly sensitive personal, professional, or confidential information; users should be aware that these inputs may be stored and used to train or improve Microsoft's AI systems.
Processing of AI interaction data raises novel compliance questions under the EU AI Act, GDPR's purpose limitation and data minimisation principles, and emerging US state AI transparency laws. Compliance teams should assess whether Microsoft's AI data retention and training practices are disclosed with sufficient specificity to satisfy GDPR's transparency requirements and to support valid consent.
Microsoft collects extensive personal data across its products, including search history, voice recordings, location data, browsing behaviour, and inferred interests, and uses this data for targeted advertising and product improvement. Users' data may be shared with affiliates, advertising partners, and other third parties, and sensitive data such as health and biometric information may also be collected in certain contexts. You can review, download, or delete your personal data via Microsoft's Privacy Dashboard at account.microsoft.com/privacy.