When you use experimental Labs Models, your data and outputs are used for AI training by default, and any opt-out settings you configured for other Mistral AI products do not carry over. You must separately activate zero data retention within Labs Models to prevent this.
This analysis describes what Mistral AI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This provision creates a product-siloed consent architecture where a business that has opted out of training for its primary Mistral AI products may unknowingly contribute training data through Labs Models unless it takes a separate, affirmative action per product.
Organizations that use Labs Models for testing experimental AI capabilities may have their Customer Data and Outputs used for model training even if they have opted out of training elsewhere, because the terms state that opt-out preferences from other products do not apply to Labs Models.
How other platforms handle this
California law gives residents the right to know what personal information we collect, use, share or sell; to delete personal information under certain circumstances; to opt-out of the sale or sharing of their personal information; to correct inaccurate personal information; to limit the use and dis...
T-Mobile collects Customer Proprietary Network Information (CPNI), which is information about the quantity, technical configuration, type, destination, location, and amount of use of your service. T-Mobile may use your CPNI within its family of companies for the purpose of providing wireless telecom...
If you would like to opt out of the disclosure of your personal information for purposes that could be considered "sales" for those third parties' own commercial purposes, or "sharing" or processing for purposes of targeted advertising, please visit the following link, which is also available in the...
"Mistral AI may provide its commercial Customers access to experimental or pre-release models identifiable by the prefix 'labs' in AI Studio ('Labs Models') for testing purposes. Labs Models are exclusively available to Mistral AI's commercial Customers and are restricted to professional use only. By using Labs Models, you acknowledge that (i) Mistral AI may use Customer Data and Outputs generated from Labs Models to train its artificial intelligence models, unless you have activated zero data retention, and (ii) the opt-out preferences you selected for other Mistral AI Products does not apply to Labs Models. If you do not want your data or outputs used for training, do not use Labs Models." — Excerpt from Mistral AI's Commercial Terms
(1) REGULATORY LANDSCAPE: This provision engages GDPR requirements on granular, specific consent and transparency, particularly where different product contexts have different data processing implications. The product-siloed opt-out structure may require evaluation under GDPR's principles of purpose limitation and transparency. The CNIL and relevant EU supervisory authorities are the enforcement bodies. The EU AI Act's data governance provisions for AI training datasets may also be engaged.

(2) GOVERNANCE EXPOSURE: High. The explicit statement that opt-out preferences from other products do not apply to Labs Models means that organizations cannot rely on a single account-level configuration to govern training data use across all Mistral AI products. This creates operational compliance risk for organizations that deploy multiple Mistral AI products.

(3) JURISDICTION FLAGS: EU/EEA organizations face the most direct exposure given GDPR's requirements on informed, specific consent for personal data processing. Organizations deploying Labs Models in regulated sectors should conduct a data protection impact assessment to determine whether experimental model usage creates disproportionate risk relative to the testing benefit.

(4) CONTRACT AND VENDOR IMPLICATIONS: Procurement and vendor management teams should flag Labs Models as a product requiring separate data governance controls at contract review. Organizations reselling or building Customer Offerings on top of Mistral AI products should ensure their own customer disclosures reflect the Labs Models training data use to avoid downstream liability.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should maintain a product-level registry of Mistral AI products in use and the corresponding data retention and opt-out configurations, verify that zero data retention is activated for Labs Models if training data use is not consented to, and assess whether End Users accessing Labs Models through Customer accounts have been adequately informed of the training data implications.
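The registry check described in item (5) can be sketched in code. This is a minimal illustrative model, not a Mistral AI API: the product names, configuration fields, and the `is_labs_model` flag are all assumptions introduced for the example. It shows why an account-level opt-out alone leaves a Labs Model exposed.

```python
# Hypothetical product-level registry sketch. Field names and product
# names are illustrative assumptions, not Mistral AI configuration values.
from dataclasses import dataclass


@dataclass
class ProductConfig:
    name: str
    training_opt_out: bool      # opt-out selected for this product
    zero_data_retention: bool   # ZDR activated for this product
    is_labs_model: bool = False  # opt-outs from other products do not carry over


def training_exposure(products: list[ProductConfig]) -> list[str]:
    """Return the products whose data may still be used for training."""
    exposed = []
    for p in products:
        if p.is_labs_model:
            # Per the terms, only zero data retention prevents training
            # use for Labs Models; other opt-outs are ignored.
            if not p.zero_data_retention:
                exposed.append(p.name)
        elif not (p.training_opt_out or p.zero_data_retention):
            exposed.append(p.name)
    return exposed


registry = [
    ProductConfig("primary-api", training_opt_out=True,
                  zero_data_retention=False),
    ProductConfig("labs-experimental", training_opt_out=True,
                  zero_data_retention=False, is_labs_model=True),
]

# Despite the opt-out carried over from the primary product,
# the Labs Model remains exposed until ZDR is activated for it.
print(training_exposure(registry))  # ['labs-experimental']
```

Modeling the Labs Models carve-out as a separate branch, rather than a single account-wide flag, mirrors the product-siloed consent architecture the provision creates.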
ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Mistral AI.