This page describes what the document states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability may vary by jurisdiction.

Methodology
This is Cursor's data use overview, which explains how your code, prompts, and editor activity are handled based on the privacy setting you choose in the app. If Privacy Mode is off, Cursor states it may use your codebase data, prompts, and editor actions to train its AI models, and may share prompts with third-party model providers such as Baseten, Together AI, and Fireworks. To prevent your code from being used for training, you can enable Privacy Mode in Cursor's settings, which the document states also applies zero data retention at the model-provider level.
This document is Cursor's 'Data Use & Privacy Overview,' last updated October 20, 2025, published by Anysphere, Inc. It governs how user code, prompts, and related data are handled depending on the privacy settings selected within the Cursor AI code editor. The document states that with Privacy Mode enabled, zero data retention is applied at the model-provider level and no code will be trained on by Cursor or any third party. With Privacy Mode disabled, the terms authorize Cursor to use and store codebase data, prompts, editor actions, code snippets, and other code data to improve AI features and train models, and to share prompts and limited telemetry with explicitly selected model providers.

A notable operational distinction is the disclosure that even when users supply their own API keys, requests still route through Cursor's backend for prompt building; user-supplied API key usage therefore does not bypass Cursor's data pipeline. In addition, inference providers including Baseten, Together AI, and Fireworks may temporarily access and store model inputs and outputs when Privacy Mode is off, with deletion stated to occur after use.

The document implicates GDPR and CCPA frameworks given Cursor's global user base and the nature of code data, which may contain personal data in jurisdictions where source code or prompts are treated as personal information. A footnote carve-out stating that data will not be shared with model providers for accounts created before October 15, 2025, creates a class-based data treatment distinction that may require evaluation under notice and consent requirements.

The document references, but does not reproduce, a full Privacy Policy at cursor.com/privacy and a security page at cursor.com/security, meaning this overview is not the complete governing document for Cursor's data practices.
Monitoring
Cursor has updated this document before.
Cross-platform context
See how other platforms handle 'Privacy Mode Off: AI Training and Model Provider Data Sharing' and similar clauses.
Governance Monitoring
Structured alerts for policy changes, governance events, and provision updates across 318+ platforms.