Figma may use the content you submit through its AI tools to train and improve its AI systems. If you are on a paid Professional or Organizational plan, you can opt out of this use in your account settings.
This analysis describes what Figma's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
Design files submitted to Figma's AI features may contain proprietary business information, client work, or sensitive intellectual property, and this clause authorizes Figma to use that material to improve its AI unless users take affirmative steps to opt out.
Interpretive note: Whether the opt-out framing provides an adequate legal basis under the GDPR, and whether free-plan users have any equivalent right, is not fully resolved by the document's language and may depend on regulatory interpretation.
For individual users on free plans, this clause means their AI-submitted content may be used for AI training without a clear opt-out mechanism. Professional and organizational plan holders have an opt-out right, but it requires active exercise to prevent this use.
How other platforms handle this
We may use the content you provide to us, including prompts and generated images, to train and improve our AI models and services.
When you use AI features of the Services, you acknowledge that your inputs may be processed by third-party AI providers. ClickUp may use anonymized and aggregated data derived from your use of the Services to improve and train AI models and features.
We may leverage OpenAI models independent of user selection for processing other tasks (e.g. for summarization). We may leverage Anthropic models independent of user selection for processing other tasks (e.g. for summarization). […]
Monitoring
Figma has changed this document before.
"We may use Customer Data and other information we collect to train, fine-tune, and improve our AI/ML models, including our AI features. For customers on Professional or Organizational plans, we provide the ability to opt out of having your Content used to train our AI models. If you opt out, we will not use your Content for this purpose." — Excerpt from Figma's Privacy Policy
REGULATORY LANDSCAPE: This provision may require evaluation under GDPR Articles 5, 6, and 13, which require that personal data be processed for specified, explicit, and legitimate purposes and that data subjects be clearly informed of processing purposes at collection. The use of content for AI training may be assessed as a secondary purpose requiring a compatibility analysis or a separate valid legal basis. The UK Information Commissioner's Office and EU data protection authorities have issued guidance on AI training data that compliance teams should consult. The FTC also has authority over unfair or deceptive data practices in the US.

GOVERNANCE EXPOSURE: High. The use of customer design content for AI model training raises significant questions about whether legitimate interests or contractual necessity provide adequate GDPR legal bases, particularly where the content belongs to organizational customers who may have their own data protection obligations to their clients. Enterprise agreements that do not explicitly address AI training use may create gaps between contractual commitments and policy permissions.

JURISDICTION FLAGS: EU and UK users face the highest exposure given GDPR and UK GDPR requirements for clear legal bases and purpose limitation. California users may have CCPA rights implicated if AI training constitutes a use of personal information beyond the original collection purpose. Organizations in healthcare, financial services, or legal sectors may face additional regulatory constraints on permitting third-party AI training use of client-related content.

CONTRACT AND VENDOR IMPLICATIONS: Procurement teams evaluating Figma as a vendor should confirm that enterprise data processing agreements explicitly restrict AI training use of organizational content, or that the opt-out has been formally exercised and documented. The policy asserts an opt-out right for Professional and Organizational plans, but the mechanism and its operational reliability should be verified during vendor assessment. Liability for unauthorized use of third-party intellectual property submitted to AI features may also warrant review.

COMPLIANCE CONSIDERATIONS: Compliance teams should document whether the AI training opt-out has been exercised for all relevant organizational accounts. Data protection impact assessments may be warranted for organizations processing sensitive or regulated content through Figma's AI features. Internal policies governing employee use of Figma AI tools should be reviewed to ensure alignment with organizational data governance obligations.
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Figma.