Palantir commits that it will not use data uploaded by its enterprise customers to train its AI systems unless the customer explicitly agrees.
For enterprise clients whose employees or operations generate data processed by Palantir, this means sensitive business or personal data cannot be repurposed to improve Palantir's AI models without the client's contractual consent, a meaningful safeguard against covert AI training.
This is a highly significant commitment given Palantir's AI platform products and the growing concern about AI companies training on client data without disclosure.
REGULATORY FRAMEWORK: This provision implicates GDPR Art. 5(1)(b) (purpose limitation) and Art. 6 (lawful basis for processing), as using customer data for AI training would constitute a new purpose requiring a fresh legal basis. It also engages the EU AI Act (Regulation 2024/1689) provisions on training data transparency and high-risk AI systems. CCPA/CPRA purpose limitation principles apply to California-resident data. In healthcare contexts, HIPAA 45 CFR §164.502 restricts use of PHI for purposes beyond treatment, payment, or operations.