AI21 Labs · AI21 Labs Terms of Use

Training Data Use from User Inputs

High severity

What it is

AI21 Labs can use the text or data you send through their services to train and improve their AI models, unless your enterprise contract specifically says they cannot.

Consumer impact (what this means for users)

Your inputs — including potentially sensitive business data or personal information — may be used by AI21 to train their AI models unless you negotiate a specific contractual prohibition, which is only realistically available to enterprise customers.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    Email privacy@ai21.com requesting deletion of your data and specifying that you object to use of your inputs for model training. Reference your account details and the specific data you want deleted.

Cross-platform context

See how other platforms handle Training Data Use from User Inputs and similar clauses.


Why it matters (compliance & risk perspective)

Most users — particularly individual developers and small businesses — will not have a separate enterprise agreement, meaning their prompts, data, and interactions become training material for AI21's models without per-session consent.

Original clause language
AI21 may use inputs, outputs, and other data provided through the Services to improve, train, and develop its models and Services, unless you have entered into a separate written agreement with AI21 that expressly prohibits such use.

Institutional analysis (Compliance & legal intelligence)

(1) REGULATORY FRAMEWORK: This provision implicates GDPR Art. 6 (lawful basis for processing personal data for training purposes — legitimate interest or consent), Art. 5(1)(b) (purpose limitation), and Art. 13 (transparency at collection). The EU AI Act (Regulation 2024/1689) Art. 53 imposes transparency and documentation requirements on providers of general-purpose AI models that use third-party data for training. CCPA §1798.120 grants California residents the right to opt out of the sale or sharing of personal information, which may be triggered depending on how training data flows are characterized. Enforcement falls to national data protection authorities (e.g., the ICO in the UK, the CNIL in France) and the California Privacy Protection Agency.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive practices related to AI training data use disclosures under FTC Act Section 5.
    File a complaint →

Provision details

Document information
Document
AI21 Labs Terms of Use
Entity
AI21 Labs
Document last updated
April 29, 2026
Tracking information
First tracked
April 30, 2026
Last verified
April 30, 2026
Record ID
CA-P-004119
Document ID
CA-D-00461
Evidence Provenance
Source URL
Wayback Machine
SHA-256
200410fbfe8d45547f3fd75da4be775c9976bd30375ad7c5fc3b502d2ae35721
Verified
✓ Snapshot stored   ✓ Change verified
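The recorded SHA-256 digest lets anyone independently verify that a stored copy of the page matches the archived snapshot. A minimal sketch of that check in Python, assuming you have saved the snapshot locally (the filename `snapshot.html` is a hypothetical placeholder):

```python
import hashlib

# SHA-256 digest recorded in the provenance record (CA-P-004119).
EXPECTED = "200410fbfe8d45547f3fd75da4be775c9976bd30375ad7c5fc3b502d2ae35721"

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# "snapshot.html" is a hypothetical local copy of the archived page.
# A match confirms the local copy is byte-identical to the snapshot:
# sha256_of("snapshot.html") == EXPECTED
```

Note that the digest is computed over the exact bytes of the archived snapshot; re-downloading the live page will generally produce a different hash if anything on the page has changed.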
How to Cite
ConductAtlas Policy Archive
Entity: AI21 Labs | Document: AI21 Labs Terms of Use | Record: CA-P-004119
Captured: 2026-04-30 06:20:21 UTC | SHA-256: 200410fbfe8d4554…
URL: https://conductatlas.com/platform/ai21-labs/ai21-labs-terms-of-use/training-data-use-from-user-inputs/
Accessed: May 2, 2026
Classification
Severity
High