Luma's platform integrates third-party AI tools that can act autonomously on your behalf — for example, interacting with the internet or other systems — yet Luma disclaims all responsibility for what those tools do. Any harm, financial loss, or unintended action caused by these tools is entirely your risk.
Regulatory framework: Autonomous AI-agent actions implicate Section 5 of the FTC Act (unfair or deceptive practices if consumers are not adequately informed of the risks of autonomous action), Title IV of the EU AI Act (transparency requirements for AI systems that interact with natural persons), and potentially the revised 2024 EU Product Liability Directive, which extends liability to AI systems. GDPR Article 22 applies where automated decision-making produces legal or similarly significant effects on individuals, and the ECPA (18 U.S.C. §§ 2510–2522) may be implicated if AI agents access third-party communications or systems on users' behalf.