If someone sues Stability AI because of something you created or did with their tools, you have to pay Stability AI's legal costs and any damages — even if the issue arose from AI-generated output you didn't intentionally create.
If AI-generated content you produce with Stability AI's tools is challenged in court for copyright infringement or other violations, you, not the company that built the AI, bear the full legal and financial burden, including paying Stability AI's own legal fees.
This is a non-standard and significant risk for commercial users: if an AI-generated image you produce is found to infringe a copyright, you — not Stability AI — are responsible for all legal costs and damages.
REGULATORY FRAMEWORK: This provision engages:
- General contract-law indemnification principles under English law (the agreement's governing-law clause);
- US copyright law (17 U.S.C. § 501 et seq.) for US-based users whose AI outputs are challenged;
- EU Copyright Directive (2019/790) Art. 4 regarding training-data exemptions and output liability;
- FTC Act Section 5 considerations regarding whether such liability allocation is disclosed with sufficient clarity to be non-deceptive.