Your private conversations with Claude — including sensitive personal topics — may become training data for AI models, and the opt-out has significant exceptions that most users will not anticipate.
Anthropic collects your conversation content (prompts and outputs), device identifiers, usage patterns, and payment information when you use Claude.ai, and by default uses your conversations to train its AI models. Even if you opt out of model training, conversations flagged for safety review may still be used, a significant limitation on the opt-out right. You can opt out of model training in your account settings on Claude.ai, and you can delete individual conversations: deleted conversations are removed from your history immediately and purged from Anthropic's back-end systems within 30 days.
How other platforms handle this
When transferring data from the European Union, the European Economic Area, the United Kingdom, and Switzerland, Dropbox relies upon a variety of legal mechanisms, such as contracts with our customers and affiliates, Standard Contractual Clauses, the EU-U.S. Data Privacy Framework, the UK Extension ...
In connection with any reorganization, restructuring, merger or sale, or other transfer of assets, we will transfer information, including personal information, provided that the receiving party agrees to respect your personal information in a manner that is consistent with our Privacy Policy.
Depending on the context, 'you' might be an End Customer, End User, Representative, or Visitor. End Customers interact with Stripe's services through Business Users (e.g., when purchasing from a merchant). For End Customers, the Business User is the primary data controller and Stripe acts as a data ...
Get alerted when Anthropic updates this policy — with plain-language summaries and severity ratings.
Is ConductAtlas affiliated with Anthropic?
No. ConductAtlas is an independent monitoring service and is not affiliated with, endorsed by, or sponsored by Anthropic.