Cursor · Cursor Data Use & Privacy Overview

Privacy Mode Off: AI Training and Model Provider Data Sharing

High severity · High confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

If Privacy Mode is off, Cursor states it may use your codebase data, prompts, editor actions, and code snippets to train its AI models, and may share prompts and limited telemetry with third-party model providers you select.

This analysis describes what Cursor's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This provision establishes the full scope of data use when Privacy Mode is disabled, authorizing collection and use of codebase data, prompts, and editor actions for AI model training and disclosure to third-party model providers.

Consumer impact (what this means for users)

With Privacy Mode off, the document authorizes Cursor to store and use codebase data, prompts, editor actions, and code snippets for AI training purposes, and to share prompts and limited telemetry with third-party model providers when those providers are explicitly selected by the user.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Enable Privacy Mode
    Open Cursor, navigate to Settings, and enable Privacy Mode to prevent your codebase data, prompts, and editor actions from being used for AI model training.

How other platforms handle this

Lime Medium

We may share your information with third-party advertising partners to provide you with targeted advertising. We also work with third-party analytics providers who help us understand how users interact with our Services. These third parties may use cookies, web beacons, and similar tracking technolo...

Oura Medium

We process personal data you provide to Oura to enable third party integrations, services, features, and offerings. For example, with your permission, our Services may integrate with third-party services like Google Health Connect and Apple HealthKit, or those of our partners. Oura takes measures to...

Substack Medium

Creators: when you subscribe to a Creator's publication, we provide them the information necessary (including your name and email address) to provide you their publication(s). Please note that Creators control their own publications; accordingly, when you interact with a Creator's publication in a w...


Monitoring

Cursor has changed this document before.

Original Clause Language (Document Record)

If you choose to turn off "Privacy Mode": we may use and store codebase data, prompts, editor actions, code snippets, and other code data and actions to improve our AI features and train our models. Prompts and limited telemetry may also be shared with model providers when you explicitly select their models.

— Excerpt from Cursor's Cursor Data Use & Privacy Overview


Institutional analysis (Compliance & governance intelligence)

(1) REGULATORY LANDSCAPE: This provision implicates GDPR Articles 6, 9, and 28, particularly the lawful basis for using code data (which may contain personal data) for AI model training and the obligations applicable to sharing that data with third-party processors or controllers. CCPA and CPRA are relevant regarding disclosure of data shared with third parties and any right to opt out of sale or sharing. The FTC Act applies to the adequacy of disclosure. EU/EEA users may require a GDPR-compliant legal basis for AI training use of personal data contained in prompts or code.

(2) GOVERNANCE EXPOSURE: High. The authorization to train AI models on codebase data, prompts, and editor actions is broad in scope and may include proprietary source code, authentication logic, or data structures. The document does not specify retention periods or data minimization practices applicable when Privacy Mode is off.

(3) JURISDICTION FLAGS: EU/EEA users face heightened exposure given GDPR requirements for an explicit lawful basis for processing personal data for AI training. California residents may evaluate this under CPRA's right to opt out of sharing personal information. Enterprises in regulated industries (financial services, healthcare, legal) should assess whether this provision is compatible with their confidentiality and data handling obligations.

(4) CONTRACT AND VENDOR IMPLICATIONS: Organizations deploying Cursor for professional software development should assess whether the Privacy Mode off state is consistent with their source code confidentiality obligations, client contracts, or regulatory requirements. The reference to "limited telemetry" shared with model providers is not further defined in this document.

(5) COMPLIANCE CONSIDERATIONS: Compliance teams should verify the default Privacy Mode state for all organizational accounts, confirm what "limited telemetry" encompasses by reviewing cursor.com/security, and assess whether data processing agreements with named model providers are in place for EU data flows.


Applicable agencies

  • FTC
    The FTC has jurisdiction over consumer data practices, including the adequacy of disclosure of AI training data use and third-party data sharing by software providers.
    File a complaint →
  • State AG
    State attorneys general in California and other jurisdictions with comprehensive privacy laws may have enforcement authority over the disclosure of data sharing with model providers and the adequacy of opt-out mechanisms.
    File a complaint →

Applicable regulations

CCPA/CPRA
California, USA
Connecticut Data Privacy Act Amendments
US-CT
FTC Act Section 5
United States Federal
GDPR
European Union
Indiana Consumer Data Protection Act
US-IN
Kentucky Consumer Data Protection Act
US-KY
Universal Opt-Out Mechanism Expansion 2026
US

Provision details

Document information
Document
Cursor Data Use & Privacy Overview
Entity
Cursor
Document last updated
May 11, 2026
Tracking information
First tracked
May 11, 2026
Last verified
May 12, 2026
Record ID
CA-P-011151
Document ID
CA-D-00764
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
7bd016281b3f2dcf271223558f9511f2d93cc13a84b3a147251127ce1af62024
Analysis generated
May 11, 2026 13:09 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
Citation Record
Entity: Cursor
Document: Cursor Data Use & Privacy Overview
Record ID: CA-P-011151
Captured: 2026-05-11 13:09:42 UTC
SHA-256: 7bd016281b3f2dcf…
URL: https://conductatlas.com/platform/cursor/cursor-data-use-privacy-overview/privacy-mode-off-ai-training-and-model-provider-data-sharing/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
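The content hash in the evidence record can be checked independently. A minimal sketch in Python, assuming you have saved a copy of the archived snapshot to a local file (the file path is hypothetical; the digest is the one recorded above, and a match only confirms the file is byte-identical to the captured snapshot):

```python
import hashlib

# SHA-256 content hash published in the evidence record above.
EXPECTED_SHA256 = "7bd016281b3f2dcf271223558f9511f2d93cc13a84b3a147251127ce1af62024"

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """Return True if the file at `path` hashes to the expected SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        # Read in chunks so large snapshots don't need to fit in memory.
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected
```

A mismatch does not by itself indicate tampering: any re-download that differs in encoding, headers, or dynamic page content will also change the digest.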
Classification
Severity
High
Categories


Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Cursor's Privacy Mode Off: AI Training and Model Provider Data Sharing clause do?

This provision establishes the full scope of data use when Privacy Mode is disabled, authorizing collection and use of codebase data, prompts, and editor actions for AI model training and disclosure to third-party model providers.

How does this clause affect you?

With Privacy Mode off, the document authorizes Cursor to store and use codebase data, prompts, editor actions, and code snippets for AI training purposes, and to share prompts and limited telemetry with third-party model providers when those providers are explicitly selected by the user.

Is ConductAtlas affiliated with Cursor?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cursor.