Apps built on Claude that offer mental health or crisis support must include real crisis resources, must not present themselves as a replacement for licensed mental health professionals, and must direct users to a licensed provider for clinical decisions.
If you are using a mental health or crisis support app powered by Claude, that app is required to provide real emergency resources and must not position itself as a substitute for professional clinical care. This gives you an enforceable baseline of safety protections.
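In practice, an operator might meet this baseline with prompt instructions plus an application-level backstop. The sketch below assumes the Anthropic Python SDK, a placeholder model name, and a deliberately naive keyword trigger; it illustrates the obligation rather than prescribing a clinically reviewed implementation.

```python
# Minimal sketch of the baseline this provision describes: a Claude-backed
# support app that frames itself as non-clinical and surfaces crisis resources.
# Assumes the Anthropic Python SDK and an ANTHROPIC_API_KEY environment variable;
# the model name, resource text, and keyword trigger are illustrative placeholders.
import anthropic

CRISIS_RESOURCES = (
    "If you are in crisis or thinking about harming yourself, call or text 988 "
    "(Suicide and Crisis Lifeline, US) or contact your local emergency services."
)

SYSTEM_PROMPT = (
    "You are a supportive wellness companion, not a licensed clinician. "
    "Never present yourself as a substitute for professional mental health care, "
    "and direct users to a licensed provider for any clinical decision. "
    "If a user expresses crisis or self-harm intent, include these resources: "
    + CRISIS_RESOURCES
)

client = anthropic.Anthropic()

def respond(user_message: str) -> str:
    reply = client.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model name
        max_tokens=512,
        system=SYSTEM_PROMPT,
        messages=[{"role": "user", "content": user_message}],
    )
    text = reply.content[0].text

    # Application-side backstop: if the message looks crisis-adjacent and the
    # model's reply omitted the hotline, append the resources anyway.
    # (A naive keyword check, shown only to illustrate the idea.)
    crisis_terms = ("suicide", "kill myself", "self-harm", "end my life")
    if any(term in user_message.lower() for term in crisis_terms) and "988" not in text:
        text += "\n\n" + CRISIS_RESOURCES
    return text
```

A real deployment would replace the keyword check with clinically reviewed escalation logic and localize the crisis resources for the user's region.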
This provision creates specific safety obligations for a rapidly growing category of AI wellness apps, protecting vulnerable users from over-relying on AI in moments of crisis.
REGULATORY FRAMEWORK: This provision implicates Section 5 of the FTC Act (deceptive health claims), FDA digital health guidance on Software as a Medical Device (SaMD) along with device quality system requirements under 21 CFR Part 820, HIPAA (45 CFR §§ 164.502-164.514) for operators handling protected health information, state mental health licensure laws, and the EU AI Act Annex III (high-risk AI systems in health contexts). Obligations related to the 988 Suicide and Crisis Lifeline, established under the National Suicide Hotline Designation Act of 2020, may also apply to operators deploying crisis-adjacent tools.