You cannot use Replicate's AI outputs to make fully automated legal decisions about people (like loan or hiring decisions), and you cannot scrape or harvest personal data from Replicate's outputs.
If you're using Replicate to build AI applications, you are prohibited from using the outputs for automated decisions that significantly affect people's lives — violations of this clause expose you (not Replicate) to regulatory enforcement and legal liability.
These prohibitions align with EU AI Act and GDPR restrictions on high-risk automated decision-making. Violating them could expose you to significant regulatory penalties, and Replicate places the full compliance burden on you as the Customer.
Regulatory framework: This provision directly mirrors GDPR Art. 22 (automated individual decision-making, including profiling) and the EU AI Act's Article 6 (high-risk AI system classification) and Article 5 (prohibited AI practices). It also engages CCPA §1798.185 (automated decision-making regulations under CPRA) and FTC Act Section 5 as applied to algorithmic decision-making in consumer contexts. EEOC guidance on AI-based employment decisions (May 2023) and CFPB Circular 2022-03 on adverse action and AI are also implicated for specific use cases. Enforcement authorities include the EU Commission and national DPAs, the FTC, the CFPB, and the EEOC.