This analysis describes what Cursor's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
This provision directly authorizes use of a developer's source code and prompts as training data, which could encompass proprietary or confidential material if Privacy Mode is not actively enabled.
With Privacy Mode disabled, the document states Cursor may store and use codebase data, prompts, editor actions, and code snippets to improve AI features and train models, and may share prompts and telemetry with third-party inference providers. Users who supply their own API keys should note the document states their requests still route through Cursor's backend for prompt building, regardless of API key source. You can enable Privacy Mode in Cursor's settings to apply zero data retention at the model-provider level and prevent your code from being used as training data.
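The routing claim above can be illustrated with a minimal sketch of a generic "bring your own key" (BYOK) flow. The function names, payload shape, and backend behavior below are illustrative assumptions about this common architecture pattern, not Cursor's actual API:

```python
# Hypothetical BYOK flow in which requests still transit the vendor's
# backend. All names here (build_prompt, route_request, the payload
# keys) are illustrative, not taken from any real Cursor interface.

def build_prompt(user_message: str, context_snippets: list[str]) -> str:
    """Backend-side prompt assembly: the vendor handles the code
    context even though the model call is billed to the user's key."""
    context = "\n".join(context_snippets)
    return f"Context:\n{context}\n\nRequest:\n{user_message}"

def route_request(user_message: str,
                  context_snippets: list[str],
                  user_api_key: str) -> dict:
    # 1. The client sends the message plus code context to the
    #    vendor backend, not directly to the model provider.
    prompt = build_prompt(user_message, context_snippets)
    # 2. The backend forwards the assembled prompt to the provider,
    #    authenticating with the user's own key. The vendor has still
    #    handled the data in transit, which is why supplying your own
    #    key alone does not keep code off the vendor's servers.
    return {"provider_payload": prompt,
            "auth": f"Bearer {user_api_key}"}

request = route_request("Refactor this loop",
                        ["for i in range(10): ..."],
                        "sk-user-123")
```

In this pattern, the user's API key only changes who is billed for the provider call; the vendor backend still observes every prompt and code snippet, which is the distinction the document draws.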
How other platforms handle this
We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...
Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
Monitoring
Cursor has changed this document before.
"If you choose to turn off 'Privacy Mode': we may use and store codebase data, prompts, editor actions, code snippets, and other code data and actions to improve our AI features and train our models."
— Excerpt from Cursor's Data Use & Privacy Overview
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cursor.