AI21 Labs · AI21 Labs Privacy Policy

Prompt and Interaction Data Collection and Model Training Use

Severity: Medium · Confidence: Medium (inferred from context) · Unique · 0 of 325 platforms
Document Record

What it is

When you use AI21's products, the text you type into the AI and the responses you receive may be saved and used to improve AI21's AI models. Enterprise API customers have separate contractual protections that may limit this use.

This analysis describes what AI21 Labs's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

Your prompts may contain sensitive, personal, or confidential information, and understanding that this content could be used for model training is important for deciding what to share with AI21's products.

Interpretive note: The exact scope of model training use and the conditions under which it applies to specific product tiers are not fully detailed in the available document excerpt, creating interpretive uncertainty.

Consumer impact (what this means for users)

Consumer product users' prompts and AI-generated outputs may be retained and used for model training and improvement, while enterprise API customers are governed by a separate data processing agreement that may restrict this use. Users who inadvertently share sensitive information in prompts should be aware that this content is collected and potentially used beyond the immediate interaction.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    Email privacy@ai21.com to request deletion of your personal data including prompt history. Identify yourself and specify the data you want deleted. AI21 is required to respond within the timeframe applicable to your jurisdiction.

How other platforms handle this

Windsurf Medium

We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...

Writer Medium

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.

Ideogram Medium

We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.


Monitoring

AI21 Labs has changed this document before.

Original Clause Language (Document Record)
When you use our Services, we collect the prompts you submit and the outputs generated by our AI models. We may use this information to improve our Services, including training and fine-tuning our AI models. If you are an enterprise customer using our API, your data is handled pursuant to the applicable data processing agreement between AI21 and your organization.

— Excerpt from the AI21 Labs Privacy Policy


Institutional analysis (Compliance & governance intelligence)

Regulatory landscape: This provision engages GDPR Article 5 (purpose limitation and data minimisation), Article 6 (lawful basis for processing), and Article 13 (transparency obligations). The use of prompt data for model training may require a clear lawful basis, and if legitimate interests is asserted, a balancing test is required. The CCPA and CPRA also apply to California residents' prompt data as personal information. The EU AI Act may impose additional obligations on high-risk AI system training data practices.

Governance exposure: High. The use of user-submitted prompts for model training creates significant compliance exposure because users may submit sensitive, confidential, or special-category data (health information, financial details, legal matters) without realizing it could be retained and used for training. The policy's distinction between consumer and enterprise users is operationally important but may not be consistently communicated at the point of data entry.

Jurisdiction flags: EU and EEA users are protected by GDPR purpose-limitation requirements, which may constrain the use of prompts for model training without explicit consent or a clearly documented legitimate interest assessment. California residents have CCPA rights to know and opt out of certain uses. UK GDPR applies equivalent restrictions for UK users. Enterprise customers in regulated sectors (healthcare, legal, financial) face heightened exposure if employees submit sector-specific data via consumer-facing interfaces rather than enterprise API endpoints.

Contract and vendor implications: Enterprise procurement teams should request and review the applicable data processing agreement to confirm that prompt data is not used for model training without explicit authorization. The policy's reference to a separate enterprise agreement creates a material distinction that should be reflected in vendor assessment documentation. B2B contracts should specify which product tier is being used and which data handling regime applies.

Compliance considerations: Legal teams should evaluate whether the current consent or notice mechanism adequately discloses model training use at the point of prompt submission. Data mapping exercises should distinguish between consumer and API data flows. If the organization operates in a regulated sector, a data protection impact assessment (DPIA) may be warranted before deploying AI21 consumer products for business use.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive data practices for US users, including whether AI21's disclosures about prompt data use for model training are adequately clear and accurate.

Applicable regulations

  • EU AI Act — European Union
  • California AB 2013 (AI Training Data Transparency) — California, USA
  • Colorado AI Act — Colorado, USA
  • EU AI Act, High-Risk Provisions — European Union
  • GDPR — European Union
  • Texas AI Act — Texas, USA
  • Trump Executive Order on AI Policy Framework — US

Provision details

Document information
  • Document: AI21 Labs Privacy Policy
  • Entity: AI21 Labs
  • Document last updated: May 5, 2026

Tracking information
  • First tracked: April 30, 2026
  • Last verified: May 10, 2026
  • Record ID: CA-P-008134
  • Document ID: CA-D-00460
Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): 4abc7ff0d7779bee955894a99670d17aadf5332ce2786437f3a3b85a2497adc3
  • Analysis generated: April 30, 2026 06:15 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
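The record above publishes a SHA-256 content hash so the archived snapshot can be verified independently of ConductAtlas. A minimal sketch of that check (the `verify_snapshot` helper and the sample payload are illustrative, not part of ConductAtlas's actual tooling):

```python
import hashlib

def verify_snapshot(snapshot_bytes: bytes, expected_hex: str) -> bool:
    """Recompute the SHA-256 digest of an archived document and
    compare it to the published content hash."""
    return hashlib.sha256(snapshot_bytes).hexdigest() == expected_hex.lower()

# Illustrative round-trip: record a hash at capture time, verify it later.
snapshot = b"example archived policy text"
recorded_hash = hashlib.sha256(snapshot).hexdigest()

assert verify_snapshot(snapshot, recorded_hash)             # intact snapshot verifies
assert not verify_snapshot(snapshot + b"!", recorded_hash)  # any modification fails
```

Because any single-byte change to the snapshot yields a different digest, the published hash serves as a stable integrity anchor for citing the captured document version.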
Citation Record
Entity: AI21 Labs
Document: AI21 Labs Privacy Policy
Record ID: CA-P-008134
Captured: 2026-04-30 06:15:21 UTC
SHA-256: 4abc7ff0d7779bee…
URL: https://conductatlas.com/platform/ai21-labs/ai21-labs-privacy-policy/prompt-and-interaction-data-collection-and-model-training-use/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
  • Severity: Medium
  • Categories:

Frequently Asked Questions

What does AI21 Labs's Prompt and Interaction Data Collection and Model Training Use clause do?

The clause states that prompts you submit and the outputs generated by AI21's models are collected and may be used to improve AI21's services, including training and fine-tuning its AI models. Enterprise API customers' data is handled under a separate data processing agreement that may limit this use.

How does this clause affect you?

Consumer product users' prompts and AI-generated outputs may be retained and used for model training and improvement, while enterprise API customers are governed by a separate data processing agreement that may restrict this use. Users who inadvertently share sensitive information in prompts should be aware that this content is collected and potentially used beyond the immediate interaction.

Is ConductAtlas affiliated with AI21 Labs?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by AI21 Labs.