TikTok · TikTok Community Guidelines

Violent Extremism and Terrorist Content Prohibition

Medium severity

What it is

TikTok prohibits all content that promotes, glorifies, or facilitates terrorism, mass violence, or violent extremism, including content from designated terrorist organizations — such content is removed immediately and may be reported to law enforcement.

Why it matters

This provision reflects both TikTok's legal obligations and its community safety commitments, but the breadth of 'glorification' language creates risk of over-removal of journalistic, educational, or counter-extremism content.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: EU Regulation 2021/784 (Terrorist Content Online Regulation, TCOR) requires hosting services to remove terrorist content within one hour of a removal order from a competent authority, with penalties up to 4% of global annual turnover for systemic non-compliance. The DSA (Art. 34) requires VLOP risk assessments for terrorist and violent extremist content. In the US, 18 U.S.C. § 2339B prohibits material support for designated terrorist organizations; platforms may face civil liability under the Anti-Terrorism Act (18 U.S.C. § 2333) if they knowingly provide substantial assistance to terrorist activity. The UK Terrorism Act 2000 and Counter-Terrorism and Security Act 2015 impose additional obligations.


Consumer impact

TikTok's Community Guidelines grant the platform broad, largely discretionary authority to remove content and to suspend or permanently ban accounts. Covered violations range from explicit harms such as child exploitation to broadly defined categories such as 'misinformation' and 'harmful or dangerous acts,' so both creators and ordinary users may be affected. Users under 16 face additional content restrictions and feature limitations, and users under 13 are subject to a separate, more restrictive experience under COPPA compliance obligations. Content removals and account actions can be appealed directly in the TikTok app via Settings, then Support, then Report a Problem.

Applicable agencies

  • FTC
    The FTC has consumer protection jurisdiction over deceptive platform practices, and platform over-removal of legitimate content could constitute an unfair practice under Section 5.

Provision details

Document information
Document: TikTok Community Guidelines
Entity: TikTok
Document last updated: March 24, 2026

Tracking information
First tracked: March 6, 2026
Last verified: March 31, 2026
Record ID: CA-P-00034005
Document ID: CA-D-00034

Evidence Provenance
Source URL: Wayback Machine
SHA-256: ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2
Verified: ✓ Snapshot stored   ✓ Change verified
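The SHA-256 digest recorded above lets anyone independently confirm that a stored snapshot of the policy text has not changed. A minimal sketch in Python, assuming you have saved a local copy of the archived page (the filename `snapshot.html` is hypothetical, not part of this record):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large snapshots do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest published in the provenance record above.
EXPECTED = "ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2"

# 'snapshot.html' is a hypothetical local copy of the archived page.
# if sha256_of("snapshot.html") == EXPECTED:
#     print("snapshot verified")
```

A match confirms byte-for-byte integrity of the capture; any edit to the file, however small, produces a completely different digest.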
How to Cite
ConductAtlas Policy Archive
Entity: TikTok | Document: TikTok Community Guidelines | Record: CA-P-00034005
Captured: 2026-03-06 20:04:33 UTC | SHA-256: ed0892da5124c518…
URL: https://conductatlas.com/platform/tiktok/tiktok-community-guidelines/violent-extremism-and-terrorist-content-prohibition/
Accessed: April 4, 2026
Classification
Severity: Medium
Categories: