Le Chat's Memory feature stores details from your conversations to personalise future responses. If you mention health or other sensitive information, it may be saved as a 'Memory', which requires your explicit consent for sensitive data.
If you mention health conditions, medications, or other sensitive details in Le Chat conversations, this information may be stored as a personalised Memory that shapes future AI responses; to prevent ongoing retention, you must actively manage or delete these Memories in settings.
Sensitive health information you casually mention in a chat could be stored and referenced in future conversations without your full awareness, raising significant privacy risks.
(1) REGULATORY FRAMEWORK: This provision directly implicates GDPR Art. 9 (processing of special categories of personal data), which requires explicit consent (Art. 9(2)(a)) or another enumerated condition before health data may be processed. The policy claims to rely on 'explicit consent' for sensitive data saved as Memories, but the mechanism, derived from what a user types into a chat interface, may not meet the GDPR standard of freely given, specific, informed, and unambiguous explicit consent. French CNIL guidance on health data processing is particularly stringent. For UK users, the UK GDPR and Schedule 1 of the Data Protection Act 2018 apply.