Microsoft
· Microsoft Privacy Statement (Legacy)
Prompts you send to Copilot and other AI features, whether typed, spoken, or shared, may contain sensitive personal, professional, or confidential information. The statement indicates this interaction data can be retained, reviewed by Microsoft employees, and used to improve AI models, a materially different use than simply answering your question and one that extends beyond the immediate interaction. Users may not realize that both their prompts and the AI-generated responses can be collected for product improvement, which could sweep in sensitive or confidential content depending on how the feature is used.
Your professional content and behavior on LinkedIn may be used to build AI systems without your active knowledge, and participation may be enabled by default, leaving you opted in unless you take action to opt out.
Use of data for AI training is a significant and relatively recent category of processing that extends beyond traditional service delivery purposes, and the opt-out rather than opt-in structure means your data is used by default.
This provision authorizes LinkedIn to use the professional content and data you contribute to the platform to develop and improve AI products, including sharing with its parent company Microsoft, which may extend the use of your data beyond the LinkedIn platform itself.
This provision means your professional history, opinions, posts, and behavioral data may be permanently incorporated into AI models, with real implications for how your data is used beyond your direct interactions with LinkedIn.
This license is broad and perpetual, meaning LinkedIn can use your professional content, name, image, and likeness to train AI models, and content that has already been shared may remain in use even after you delete it from the platform.
Your personal browsing habits, pins, and interactions could be used to train AI systems, which raises concerns about consent and the long-term use of your data beyond the original purpose.
Slack
· Slack Privacy Policy
Your workplace messages could potentially be used to train Slack's AI models, and whether this happens depends on your employer's contract settings — individual users typically have no direct control over this.
Yelp
· Yelp Privacy Policy
This means your personal data and content contributions could be used to train AI models, potentially without a clear way to opt out, and may be shared with third-party AI companies whose own privacy practices apply.
Your files and behavior on Dropbox could be used to develop AI systems, raising questions about consent, data minimization, and the scope of use beyond storage.
This authorization permits Atlassian to use customer-submitted content and usage data for AI model training and product improvement purposes, which may be material for organizations with confidential data, regulated data, or sector-specific data handling obligations.
Automated employment matching systems can encode bias and affect your job opportunities without human review, and the policy does not describe an explicit opt-out mechanism for AI-driven profiling.
Zoom
· Zoom Privacy Statement
This provision determines whether the content of your meetings, including things you say, type, or share, may be used to improve Zoom's AI products. Because the opt-out is assigned to account administrators rather than individual users or participants, individuals who join meetings on accounts they do not control cannot directly manage this setting.
Using customer financial and behavioral data to train AI systems without a clear opt-out is an emerging area of regulatory concern and may constitute secondary processing beyond what users reasonably expect when signing up for a payments app.
Using consumer data to train AI models is an emerging area of regulatory scrutiny globally, and consumers generally do not expect their personal interactions to permanently inform commercial AI systems, particularly without explicit consent.
Your personal content and interactions on Pinterest may be used to build AI systems without a standalone, specific opt-out mechanism, raising concerns about intellectual property, consent, and the scope of data use beyond the original purpose.
Your financial transactions, behavioral patterns, and personal information may be permanently incorporated into AI models that affect how Cash App treats you and other users, with no opt-out mechanism disclosed for this specific use.
Using personal data for AI model training raises questions about consent, data minimization, and the purposes for which data was originally collected, particularly under GDPR and emerging AI governance frameworks.
Roblox
· Roblox Privacy and Cookie Policy
Your messages, audio, and other content on Roblox may be used to train artificial intelligence systems, which is a significant secondary use of your personal data that you may not have anticipated.
This provision authorizes X to use broad categories of personal data, including content you create and how you interact with the platform, to develop and improve AI systems, which is a use that may extend beyond what users typically anticipate from a social media service.
This provision means personal data you provide, or that Thomson Reuters collects about you, could be used to build AI systems, raising questions about what data is used, for how long, and whether individuals have effective control over that use.
The text you submit to Grammarly may include sensitive personal, professional, or confidential information, and using it for AI training goes significantly beyond the core service of providing writing suggestions.
The authorization to use personal data for AI training is explicit and broad, and the notice does not describe limits on which data categories may be used for this purpose or how long AI-trained models derived from user data are retained.
Your professional data and behavior on LinkedIn may be used to build AI systems that affect how content, jobs, and people are ranked and recommended across the platform.
Using personal data for AI model training is a non-obvious secondary purpose that goes beyond service delivery — it can affect how AI systems behave and who benefits from your data contributions.
Business users managing confidential client projects inside ClickUp may unknowingly contribute that content to AI training datasets, potentially conflicting with their own confidentiality obligations.
This restriction places the legal and ethical burden of obtaining consent directly on the customer, and failure to comply constitutes a breach of agreement that can trigger immediate suspension and indemnification obligations.