Ideogram · Ideogram Privacy Policy

AI Model Training Use of User Content

High severity

What it is

Ideogram may use the text prompts you type and the images it generates for you to train and improve its AI models.

Consumer impact (what this means for users)

Your submitted prompts and generated images may be incorporated into Ideogram's AI training datasets and could influence future model outputs; the policy describes no clear individual opt-out mechanism.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Delete Your Data
    Within 30–45 days, depending on jurisdiction
    Email Ideogram's privacy team requesting deletion of your personal data, including prompts and generated images. State your account email and specify the data categories you want deleted. Response timeframes are set by applicable law (45 days under the CCPA, one month under the GDPR).

Cross-platform context

See how other platforms handle AI Model Training Use of User Content and similar clauses.

Compare across platforms →

Why it matters (compliance & risk perspective)

This means your creative inputs — the ideas you describe in prompts — become training material for a commercial AI system, which most users do not expect when generating images for personal use.

Original clause language
We may use the content you submit to our Services, including prompts and generated images, to develop, train, and improve our AI models and Services. By using our Services, you grant us a license to use your inputs and outputs for these purposes.

Institutional analysis (Compliance & legal intelligence)

(1) REGULATORY FRAMEWORK: This provision implicates GDPR Art. 6(1)(a) (consent) and Art. 6(1)(f) (legitimate interests) as potential legal bases for processing personal data for AI training; the European Data Protection Board has indicated that AI training does not automatically qualify as a compatible purpose under Art. 5(1)(b). The CCPA's definitions of "sharing" and "sale" (§1798.140) may apply if training data is disclosed to model infrastructure partners. The EU AI Act (Regulation (EU) 2024/1689), particularly the obligations on providers of general-purpose AI models under Chapter V, may require disclosure of a summary of training data sources. FTC Act Section 5 applies where training use is not clearly disclosed at the point of data collection.

Compliance intelligence locked

Regulatory citations, enforcement risk, and due diligence action items.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive data practices under Section 5, including undisclosed use of consumer-generated content for AI model training.
    File a complaint →

Provision details

Document information
  Document: Ideogram Privacy Policy
  Entity: Ideogram
  Document last updated: April 29, 2026

Tracking information
  First tracked: May 2, 2026
  Last verified: May 2, 2026
  Record ID: CA-P-004442
  Document ID: CA-D-00490

Evidence Provenance
  Source URL: Wayback Machine
  SHA-256: 33f445f42f1bbf4ff46e8ff0ddf6f46772818422d079b8a43477799871ef9d50
  Verified: ✓ Snapshot stored · ✓ Change verified
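The published SHA-256 fingerprint lets anyone independently confirm that a copy of the archived snapshot is unaltered. A minimal sketch in Python, assuming you have saved the snapshot locally (the filename `snapshot.html` is hypothetical, not something the archive specifies):

```python
import hashlib

# Published fingerprint from the Evidence Provenance record above.
EXPECTED = "33f445f42f1bbf4ff46e8ff0ddf6f46772818422d079b8a43477799871ef9d50"

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks to bound memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (hypothetical local filename):
# digest = sha256_of("snapshot.html")
# print("verified" if digest == EXPECTED else "MISMATCH: file differs from archived snapshot")
```

Any single-byte difference between your copy and the archived snapshot produces a completely different digest, so a match is strong evidence you hold the same bytes the archive captured.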
How to Cite
  ConductAtlas Policy Archive
  Entity: Ideogram | Document: Ideogram Privacy Policy | Record: CA-P-004442
  Captured: 2026-05-02 00:49:23 UTC | SHA-256: 33f445f42f1bbf4f…
  URL: https://conductatlas.com/platform/ideogram/ideogram-privacy-policy/ai-model-training-use-of-user-content/
  Accessed: May 2, 2026

Classification
  Severity: High
  Categories:
