You and Anthropic agree to resolve any disputes through binding individual arbitration and not through a class action. This means that you and Anthropic each waive the right to a trial by jury and the right to participate in a class action lawsuit or class-wide arbitration.
Mandatory arbitration eliminates your right to sue Anthropic in court and prevents you from joining with other users in a class action, significantly reducing your practical ability to seek compensation for small-dollar harms.
Anthropic's Consumer Terms permit the company to use your conversations, inputs, and outputs to train AI models by default, with a limited opt-out that does not apply when you provide feedback or when your content is flagged for safety review. To opt out, go to your Claude.ai account settings and turn off the setting that allows your chats to be used for model training. Subscriptions auto-renew, and fees are non-refundable: if you cancel mid-cycle, you will still be charged for the full billing period and will not receive a prorated refund.