AI21 Labs may use the text or data you send through its services to train and improve its AI models, unless your enterprise contract specifically prohibits it.
Your inputs — including potentially sensitive business data or personal information — may therefore become training material unless you negotiate a contractual prohibition, an option realistically available only to enterprise customers.
Most users — particularly individual developers and small businesses — will not have a separate enterprise agreement, meaning their prompts, data, and interactions become training material for AI21's models without per-session consent.
(1) REGULATORY FRAMEWORK: This provision implicates GDPR Art. 6 (lawful basis for processing personal data for training purposes — legitimate interest or consent), Art. 5(1)(b) (purpose limitation), and Art. 13 (transparency at collection). The EU AI Act (Regulation 2024/1689) Art. 53 imposes transparency and documentation requirements on providers of general-purpose AI models that train on third-party data. CCPA §1798.120 grants California residents the right to opt out of the sale or sharing of personal information, which may be triggered depending on how training data flows are characterized. Enforcement falls to national supervisory authorities (e.g., the CNIL in France, and the ICO in the UK under the UK GDPR) and the California Privacy Protection Agency.