This analysis describes what Microsoft Azure's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This means conversations you have with Xbox or other Microsoft AI features are not private — they may be stored, reviewed by humans, and used to build Microsoft's AI products.
Microsoft now discloses that it may contact you by phone for marketing using automated dialers and AI-generated voices if you have consented to marketing communications, which represents a new disclo…
Microsoft's privacy policy now provides a less detailed explanation of how long your data is retained. Previously, the policy included specific examples, such as how long deleted emails remain in you…
Microsoft's updated retention policy provides greater specificity about how long your data persists and under what conditions it is deleted. The policy now explicitly states that deleted items from O…
Your voice commands and AI chat interactions on Xbox and other Microsoft services may be reviewed by Microsoft employees and used to train AI models, creating ongoing data exposure beyond the immediate interaction.
How other platforms handle this
Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.
We are simplifying our Terms of Use, including clarifications around the use of AI tools, and their data use. We have moved the terms that describe AI Features, which were previously written for a Creator audience and located under the AI-Based Tools Supplemental Terms and Disclaimer, into the User ...
We may use machine learning and other artificial intelligence (AI) technologies ("AI Technologies") to provide and improve our Service. For example, we may use such AI Technologies to analyze and process your contributions and interactions to provide you with personalized experiences, content recomm...
Monitoring
Microsoft Azure has changed this document before.
"When you use Microsoft products, Microsoft may collect voice data and use interactions with AI features to improve Microsoft products and services, including training and improving AI models. Voice data and AI interaction data may be reviewed by Microsoft employees and vendors." — Excerpt from Microsoft Azure's Microsoft Privacy
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Microsoft Azure.