TikTok · TikTok Community Guidelines

Dangerous Challenges and Harmful Activities Prohibition

High severity

What it is

TikTok prohibits content that shows, promotes, or encourages dangerous activities, challenges, or behaviors that could result in serious injury or death, including the viral 'dangerous challenges' that have been amplified on the platform.

Why it matters

TikTok has faced significant litigation and regulatory scrutiny over its role in amplifying dangerous viral challenges, particularly among minors. This provision reflects the platform's response to that liability exposure, though critics argue enforcement remains inconsistent.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: The EU DSA (Art. 34-35) requires VLOP risk assessments for systemic risks to physical safety, including amplification of dangerous content through recommendation algorithms. The UK Online Safety Act 2023 (s.12) classifies content that encourages self-harm or suicide as priority illegal content requiring proactive removal. In the US, multiple product liability lawsuits have been filed against TikTok alleging the platform's algorithm negligently recommended dangerous challenge content to minors; TikTok's Section 230 immunity in these cases is being actively litigated. COPPA also creates heightened obligations regarding harmful content served to users under 13.


Consumer impact

TikTok's Community Guidelines grant the platform broad, largely discretionary authority to remove content and to suspend or permanently ban accounts. Covered violations range from explicit harms such as child exploitation to broadly defined categories such as 'misinformation' and 'harmful or dangerous acts,' which may affect creators and ordinary users alike. Users under 16 face additional content restrictions and feature limitations, and users under 13 are subject to a separate, more restrictive experience under COPPA compliance obligations. You can appeal content removals and account actions directly within the TikTok app by navigating to Settings, then Support, then Report a Problem.

Applicable agencies

  • FTC
    The FTC has authority over unfair or deceptive practices under Section 5 of the FTC Act, and TikTok's algorithmic amplification of dangerous content to minors has been cited in FTC-related enforcement actions.
  • State AG
    Multiple state Attorneys General have filed suits against TikTok specifically alleging the platform's algorithm recommended dangerous and harmful content to minors, creating direct enforcement exposure.

Provision details

Document information
Document
TikTok Community Guidelines
Entity
TikTok
Document last updated
March 24, 2026
Tracking information
First tracked
March 6, 2026
Last verified
March 31, 2026
Record ID
CA-P-00034007
Document ID
CA-D-00034
Evidence Provenance
Source URL
Wayback Machine
SHA-256
ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2
Verified
✓ Snapshot stored   ✓ Change verified
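The published SHA-256 digest lets anyone independently verify that a stored snapshot of the document is byte-identical to the archived version. A minimal sketch of that check, assuming you have a local copy of the snapshot (the filename `snapshot.html` is hypothetical; the expected digest is the one published above):

```python
import hashlib

# Digest published in the Evidence Provenance record above
EXPECTED_SHA256 = "ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2"

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage against a locally saved snapshot:
# if sha256_of_file("snapshot.html") == EXPECTED_SHA256:
#     print("Snapshot matches the archived record")
```

A match confirms only that the file is bit-for-bit identical to the captured snapshot; it says nothing about whether the live page has since changed.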
How to Cite
ConductAtlas Policy Archive
Entity: TikTok | Document: TikTok Community Guidelines | Record: CA-P-00034007
Captured: 2026-03-06 20:04:33 UTC | SHA-256: ed0892da5124c518…
URL: https://conductatlas.com/platform/tiktok/tiktok-community-guidelines/dangerous-challenges-and-harmful-activities-prohibition/
Accessed: April 4, 2026
Classification
Severity
High
Categories
