Figma · Figma Privacy Policy

AI Feature Content Used for Model Training

High severity · Medium confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Figma may use the content you submit through its AI tools to train and improve its AI systems. If you are on a paid Professional or Organizational plan, you can opt out of this use in your account settings.

This analysis describes what Figma's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

Design files submitted to Figma's AI features may contain proprietary business information, client work, or sensitive intellectual property, and this clause authorizes Figma to use that material to improve its AI unless users take affirmative steps to opt out.

Interpretive note: The adequacy of the opt-out framing as a legal basis under GDPR, and whether free-plan users have any equivalent right, is not fully resolved by the document language and may depend on regulatory interpretation.

Consumer impact (what this means for users)

For individual users on free plans, this clause means their AI-submitted content may be used for AI training without a clear opt-out mechanism. Professional and Organizational plan holders have an opt-out right, but it must be actively exercised to prevent this use.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Opt Out of AI Model Training
    Log into your Figma account, navigate to Settings, locate the AI data training or privacy section, and disable the option that permits your content to be used for AI model training. Organization admins should review organization-level settings as well.

How other platforms handle this

Ideogram Medium

We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.

ClickUp Medium

When you use AI features of the Services, you acknowledge that your inputs may be processed by third-party AI providers. ClickUp may use anonymized and aggregated data derived from your use of the Services to improve and train AI models and features.

Windsurf Medium

We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). We may leverage these models independent of user selection for processi...


Monitoring

Figma has changed this document before.

Original Clause Language (Document Record)

"We may use Customer Data and other information we collect to train, fine-tune, and improve our AI/ML models, including our AI features. For customers on Professional or Organizational plans, we provide the ability to opt out of having your Content used to train our AI models. If you opt out, we will not use your Content for this purpose."

— Excerpt from Figma's Figma Privacy Policy


Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision may require evaluation under GDPR Articles 5, 6, and 13, which require that personal data be processed for specified, explicit, and legitimate purposes and that data subjects be clearly informed of processing purposes at collection. The use of content for AI training may be assessed as a secondary purpose requiring a compatibility analysis or a separate valid legal basis. The UK Information Commissioner's Office and EU data protection authorities have issued guidance on AI training data that compliance teams should consult. The FTC also has authority over unfair or deceptive data practices in the US.

GOVERNANCE EXPOSURE: High. The use of customer design content for AI model training raises significant questions about whether legitimate interests or contractual necessity provide adequate GDPR legal bases, particularly where the content belongs to organizational customers who may have their own data protection obligations to their clients. Enterprise agreements that do not explicitly address AI training use may create gaps between contractual commitments and policy permissions.

JURISDICTION FLAGS: EU and UK users face the highest exposure given GDPR and UK GDPR requirements for clear legal bases and purpose limitation. California users may have CCPA rights implicated if AI training constitutes a use of personal information beyond the original collection purpose. Organizations in healthcare, financial services, or legal sectors may face additional regulatory constraints on permitting third-party AI training use of client-related content.

CONTRACT AND VENDOR IMPLICATIONS: Procurement teams evaluating Figma as a vendor should confirm that enterprise data processing agreements explicitly restrict AI training use of organizational content, or that the opt-out has been formally exercised and documented. The policy asserts an opt-out right for Professional and Organizational plans, but the mechanism and its operational reliability should be verified during vendor assessment. Liability for unauthorized use of third-party intellectual property submitted to AI features may also warrant review.

COMPLIANCE CONSIDERATIONS: Compliance teams should document whether the AI training opt-out has been exercised for all relevant organizational accounts. Data protection impact assessments may be warranted for organizations processing sensitive or regulated content through Figma's AI features. Internal policies governing employee use of Figma AI tools should be reviewed to ensure alignment with organizational data governance obligations.

Full compliance analysis

Regulatory citations, enforcement risk, and due diligence action items.


Applicable agencies

  • FTC
    The FTC has jurisdiction over unfair or deceptive data practices by US companies, including representations about AI training data use and consumer opt-out mechanisms.
    File a complaint →

Applicable regulations

EU AI Act
European Union
Colorado AI Act
US-CO
GDPR
European Union
Texas AI Act
Texas, USA
UK GDPR
United Kingdom

Provision details

Document information
Document
Figma Privacy Policy
Entity
Figma
Document last updated
May 5, 2026
Tracking information
First tracked
May 8, 2026
Last verified
May 11, 2026
Record ID
CA-P-010178
Document ID
CA-D-00544
Evidence Provenance
Source URL
Wayback Machine
Content hash (SHA-256)
315fb012bac613a0c2ab4c786331faed0efcf8a6a9a30d7fb56cce37350ff08d
Analysis generated
May 8, 2026 13:38 UTC
Methodology
Evidence
✓ Snapshot stored   ✓ Hash verified
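The provenance record above pairs an archived snapshot with a published SHA-256 content hash, which lets anyone independently confirm that a stored copy of the policy is byte-identical to what was captured. A minimal sketch of that check, in Python, is shown below; the filename `snapshot.html` is a hypothetical local path for the archived document, and the expected digest is the one published in this record.

```python
import hashlib

# Published content hash from the ConductAtlas record above.
EXPECTED_SHA256 = "315fb012bac613a0c2ab4c786331faed0efcf8a6a9a30d7fb56cce37350ff08d"

def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """True if the file's digest matches the published record hash."""
    return sha256_of_file(path) == expected
```

Note that the comparison is only meaningful against the exact archived bytes; re-downloading the live policy page will almost certainly yield a different digest, since the document has been updated since capture.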
Citation Record
Entity: Figma
Document: Figma Privacy Policy
Record ID: CA-P-010178
Captured: 2026-05-08 13:38:05 UTC
SHA-256: 315fb012bac613a0…
URL: https://conductatlas.com/platform/figma/figma-privacy-policy/ai-feature-content-used-for-model-training/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity
High
Categories



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Figma's AI Feature Content Used for Model Training clause do?

Design files submitted to Figma's AI features may contain proprietary business information, client work, or sensitive intellectual property, and this clause authorizes Figma to use that material to improve its AI unless users take affirmative steps to opt out.

How does this clause affect you?

For individual users on free plans, this clause means their AI-submitted content may be used for AI training without a clear opt-out mechanism. Professional and Organizational plan holders have an opt-out right, but it must be actively exercised to prevent this use.

Is ConductAtlas affiliated with Figma?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Figma.