If Privacy Mode is off, Cursor states it may use your codebase data, prompts, editor actions, and code snippets to train its AI models, and may share prompts and limited telemetry with third-party model providers you select.
This analysis describes what Cursor's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This provision establishes the full scope of data use when Privacy Mode is disabled, authorizing collection and use of codebase data, prompts, and editor actions for AI model training and disclosure to third-party model providers.
With Privacy Mode off, the document authorizes Cursor to store and use codebase data, prompts, editor actions, and code snippets for AI training purposes, and to share prompts and limited telemetry with third-party model providers when those providers are explicitly selected by the user.
Cursor has changed this document before.
"If you choose to turn off 'Privacy Mode': we may use and store codebase data, prompts, editor actions, code snippets, and other code data and actions to improve our AI features and train our models. Prompts and limited telemetry may also be shared with model providers when you explicitly select their models."
— Excerpt from Cursor's Data Use & Privacy Overview
(1) REGULATORY LANDSCAPE: This provision implicates GDPR Articles 6, 9, and 28, particularly the lawful basis for using code data (which may contain personal data) for AI model training and the obligations applicable to sharing that data with third-party processors or controllers. The CCPA and CPRA are relevant to the disclosure of data shared with third parties and any right to opt out of sale or sharing. The FTC Act applies to the adequacy of disclosure. Processing for EU/EEA users may require a GDPR-compliant legal basis for AI training use of personal data contained in prompts or code.

(2) GOVERNANCE EXPOSURE: High. The authorization to train AI models on codebase data, prompts, and editor actions is broad in scope and may include proprietary source code, authentication logic, or data structures. The document does not specify retention periods or data minimization practices applicable when Privacy Mode is off.

(3) JURISDICTION FLAGS: EU/EEA users face heightened exposure given GDPR requirements for an explicit lawful basis for processing personal data for AI training. California residents may evaluate this under the CPRA's right to opt out of sharing personal information. Enterprises in regulated industries (financial services, healthcare, legal) should assess whether this provision is compatible with their confidentiality and data handling obligations.

(4) CONTRACT AND VENDOR IMPLICATIONS: Organizations deploying Cursor for professional software development should assess whether the Privacy Mode off state is consistent with their source code confidentiality obligations, client contracts, or regulatory requirements. The reference to 'limited telemetry' shared with model providers is not further defined in this document.
(5) COMPLIANCE CONSIDERATIONS: Compliance teams should verify the default Privacy Mode state for all organizational accounts, confirm what 'limited telemetry' encompasses by reviewing cursor.com/security, and assess whether data processing agreements with named model providers are in place for EU data flows.
Is ConductAtlas affiliated with Cursor? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cursor.