
CSAM and Child Sexual Exploitation Zero-Tolerance Policy

High severity

What it is

TikTok has a zero-tolerance policy for child sexual abuse material (CSAM) and any content that sexualizes minors — any such content results in immediate account removal and referral to the National Center for Missing and Exploited Children (NCMEC) and law enforcement.

Why it matters

This provision reflects TikTok's mandatory legal obligations under 18 U.S.C. § 2258A, which requires electronic service providers to report apparent CSAM to NCMEC — failure to comply carries criminal penalties.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: 18 U.S.C. § 2258A requires electronic service providers to report apparent CSAM to the NCMEC CyberTipline; failure to report is a federal criminal offense. The PROTECT Our Children Act of 2008 (which added 18 U.S.C. §§ 2258A–2258E) and COPPA also apply. The EU Digital Services Act (Arts. 34–35) requires VLOPs to conduct risk assessments covering child sexual exploitation and to implement mitigation measures. In the UK, the Online Safety Act 2023 (Part 3) imposes proactive duties to detect and remove child sexual exploitation and abuse (CSEA) content.


Consumer impact

TikTok's Community Guidelines grant the platform broad, largely discretionary authority to remove content and to suspend or permanently ban accounts. Violations range from explicit harms such as child exploitation to broadly defined categories such as 'misinformation' and 'harmful or dangerous acts,' so enforcement can affect creators and ordinary users alike. Users under 16 face additional content restrictions and feature limitations, and users under 13 are subject to a separate, more restrictive experience under COPPA compliance obligations. You can appeal content removals and account actions directly within the TikTok app by navigating to Settings, then Support, then Report a Problem.

Applicable agencies

  • FTC
    The FTC coordinates with DOJ on COPPA and child exploitation enforcement involving digital platforms including TikTok.

Provision details

Document information
  • Document: TikTok Community Guidelines
  • Entity: TikTok
  • Document last updated: March 24, 2026

Tracking information
  • First tracked: March 6, 2026
  • Last verified: March 31, 2026
  • Record ID: CA-P-00034003
  • Document ID: CA-D-00034

Evidence provenance
  • Source URL: Wayback Machine
  • SHA-256: ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2
  • Verified: ✓ Snapshot stored · ✓ Change verified
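The published SHA-256 digest lets anyone independently confirm that a retained copy of the guidelines matches the snapshot this record describes. A minimal verification sketch in Python, assuming the snapshot has been saved locally (the filename `snapshot.html` is hypothetical, not part of this record):

```python
import hashlib

# Digest published in the provenance block above (record CA-P-00034003)
EXPECTED_SHA256 = "ed0892da5124c51862507e249c93b111e6234660e7333c44db1f8171e83cd1a2"

def verify_snapshot(path: str, expected: str = EXPECTED_SHA256) -> bool:
    """Return True if the file at `path` hashes to the expected digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large snapshots don't need to fit in memory
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected

# Example (hypothetical local file):
# verify_snapshot("snapshot.html")
```

A match confirms only byte-for-byte integrity of the stored file; it does not attest to when or from where the snapshot was captured, which is what the timestamped Wayback Machine reference is for.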
How to Cite
ConductAtlas Policy Archive
Entity: TikTok | Document: TikTok Community Guidelines | Record: CA-P-00034003
Captured: 2026-03-06 20:04:33 UTC | SHA-256: ed0892da5124c518…
URL: https://conductatlas.com/platform/tiktok/tiktok-community-guidelines/csam-and-child-sexual-exploitation-zero-tolerance-policy/
Accessed: April 4, 2026
Classification
  • Severity: High