Depending on where you live and the laws that apply in your country of residence, you may have certain rights regarding your personal data, as described further below. However, please be aware that these rights are limited, and that the process by which we action your requests regarding our training dataset is complex. We may also decline a request if we have a lawful reason for doing so.
The acknowledgment that training-dataset requests are "complex" and may be declined signals that deletion and correction rights are not fully enforceable in practice for data already incorporated into Claude's model weights, a limitation that may conflict with deletion obligations under the GDPR and CCPA.
Anthropic collects your conversation content (prompts and AI responses), device identifiers, browsing behavior, and any personal data you include in messages to Claude, and may use this data to train its AI models. Users who opt out of model training should be aware that conversations flagged for safety or policy review can still be used for training without their consent, a meaningful limitation on the opt-out right. You can opt out of having your conversations used for model training by navigating to your Claude.ai account settings, or you can submit a data deletion request by emailing privacy@anthropic.com.