Slack can use your organization's messages, files, and other workspace data to train and improve its artificial intelligence features, unless your workspace administrator turns off this option.
This provision means that messages and files you send in Slack — including sensitive business communications and personal information — may be processed for AI model training, creating privacy and confidentiality risks for both employees and the organizations they work for.
Confidential business communications, personal employee data, and sensitive files shared in Slack workspaces could be used by Slack to train AI models unless the organization takes affirmative steps to opt out.
REGULATORY FRAMEWORK: This provision implicates GDPR Article 6 (lawful basis for processing), Article 22 (automated decision-making), and Article 28 (processor obligations) — the EU lead supervisory authority is the Irish DPC. It also engages CCPA/CPRA Cal. Civ. Code §1798.140(ag), which restricts service providers from secondary use of personal information, and the EU AI Act (Regulation (EU) 2024/1689), Articles 10 and 53, on data governance and obligations for providers of general-purpose AI models. Section 5 of the FTC Act applies where data use exceeds consumers' reasonable expectations.