Windsurf uses your code completions to train ranking models (not code-generating models) by default, but you can opt out of this in your account settings without losing Autocomplete functionality.
This analysis describes what Windsurf's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology for details on how these analyses are produced.
Unlike Chat data training, opting out of Autocomplete data training does not result in loss of service access, providing users a meaningful choice. The agreement also explicitly commits to not using Autocomplete data to train generative models.
Autocomplete User Content is used for discriminative model training by default, with anonymization stated. Users can opt out via account settings without losing Autocomplete service access, which is a less restrictive condition than the Chat opt-out mechanism.
How other platforms handle this
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
After registration, you may create, upload or transmit files, documents, videos, images, data or information as part of your use of the Service (collectively, "User Content"). This includes any inputs you provide to our AI-powered support tools and outputs generated in response to your inputs. User ...
When you use AI features of the Services, you acknowledge that your inputs may be processed by third-party AI providers. ClickUp may use anonymized and aggregated data derived from your use of the Services to improve and train AI models and features.
Monitoring
Windsurf has changed this document before.
"We may use your Autocomplete User Content to improve our discriminative machine learning models, which are models that rank or assign scores to code generations in order to understand the boundaries between different sets of code. We will never use your Autocomplete User Content to improve generative machine learning models, which are models that are able to generate code directly based on studying existing code generations, for Autocomplete or other services. Any Autocomplete User Content used for training our discriminative machine learning models is anonymized, such that any personally identifiable information is removed. To opt out of having your Autocomplete User Content used for such purpose, you may change the code sharing options in the User Settings pane of the user's profile page. Please note that if you opt out, your Autocomplete User Content will be sent to our servers so that we are able to provide you with the Services, but we will not retain your Autocomplete User Content on our servers for training our discriminative machine learning models."

— Excerpt from the Windsurf Terms of Service
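The discriminative/generative distinction the excerpt draws can be illustrated with a toy sketch. This is purely illustrative and is not Windsurf's implementation: the scoring rule and function names below are invented for the example. The point is that a discriminative ranker only assigns scores to completions that already exist, while a generative model would produce the completion text itself.

```python
def rank_completions(prefix, candidates):
    """Toy discriminative ranker: orders existing candidate
    completions by score. It never generates code itself; it
    only learns/applies a boundary between better and worse
    candidates, which is the kind of model the excerpt permits
    training on by default."""
    def score(candidate):
        # Hypothetical scoring rule for illustration only:
        # reward candidates that extend the typed prefix,
        # then prefer shorter (less noisy) candidates.
        extends_prefix = candidate.startswith(prefix)
        return (1 if extends_prefix else 0, -len(candidate))
    return sorted(candidates, key=score, reverse=True)

candidates = ["print(value)", "print(v)", "pr int()"]
ranked = rank_completions("print", candidates)
```

A generative model, by contrast, would be trained to emit the string `"print(v)"` token by token; that is the training use the excerpt rules out for Autocomplete User Content.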
(1) REGULATORY LANDSCAPE: This provision engages GDPR's consent and legitimate-interests grounds for processing under Article 6, and CCPA provisions regarding use of personal information for training purposes. The explicit commitment not to use Autocomplete data for generative model training is a notable operational distinction that compliance teams should document. The FTC's guidance on AI and data practices is relevant to the stated anonymization process.

(2) GOVERNANCE EXPOSURE: Medium. The opt-out mechanism is less restrictive than the Chat equivalent, and the document draws a clear distinction between discriminative and generative model use. However, the anonymization process is asserted rather than independently audited, and "discriminative model training" may still involve processing of code patterns that could be commercially sensitive.

(3) JURISDICTION FLAGS: EU users should evaluate whether default enrollment in Autocomplete training satisfies GDPR's consent or legitimate-interests requirements. California users should assess CCPA applicability. The explicit carve-out from generative model training is relevant to EU AI Act risk-classification considerations.

(4) CONTRACT AND VENDOR IMPLICATIONS: Enterprise procurement teams should note the stated distinction between discriminative and generative model training in vendor assessments, and verify that proprietary code submitted via Autocomplete is covered by the anonymization commitment.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should document the opt-out election in records of processing activities under GDPR Article 30, and confirm that the account-settings opt-out mechanism functions as described.
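The anonymization the agreement asserts ("personally identifiable information is removed") can be sketched as a simple scrubbing pass over code snippets. This is an assumption-laden illustration, not Windsurf's actual pipeline: the PII patterns below (emails and API-key-shaped tokens) are hypothetical, and real anonymization systems are far more involved, which is why the analysis above flags that the process is asserted rather than audited.

```python
import re

# Hypothetical PII patterns for illustration: email addresses
# and secret-key-like tokens embedded in code snippets.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
API_KEY = re.compile(r"\b(?:sk|key)[-_][A-Za-z0-9]{16,}\b")

def scrub(snippet: str) -> str:
    """Replace matched PII with neutral placeholders before a
    snippet would be retained for training."""
    snippet = EMAIL.sub("<EMAIL>", snippet)
    snippet = API_KEY.sub("<SECRET>", snippet)
    return snippet

scrubbed = scrub('notify("dev@example.com", token="sk-abcdef1234567890AB")')
```

Even a pass like this leaves the surrounding code structure intact, which is the residual concern noted above: scrubbed snippets may still carry commercially sensitive code patterns.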
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Built from archived source documents, structured governance mappings, and historical version tracking.
ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Windsurf.