TikTok has a zero-tolerance policy for child sexual abuse material (CSAM) and any content that sexualizes minors — any such content results in immediate account removal and referral to the National Center for Missing and Exploited Children (NCMEC) and law enforcement.
This provision reflects TikTok's mandatory legal obligation under 18 U.S.C. § 2258A, which requires electronic service providers to report apparent CSAM to NCMEC; failure to comply carries criminal penalties.
REGULATORY FRAMEWORK: 18 U.S.C. § 2258A requires electronic service providers to report apparent CSAM to the NCMEC CyberTipline; failure to report is a federal criminal offense. The PROTECT Our Children Act of 2008 (which added 18 U.S.C. §§ 2258A–2258E) and COPPA also apply. The EU Digital Services Act (Arts. 34–35) requires VLOPs to conduct risk assessments covering child sexual exploitation and to implement mitigation measures. In the UK, the Online Safety Act 2023 (Part 3) imposes proactive duties to detect and remove child sexual exploitation and abuse (CSEA) content.
TikTok's Community Guidelines grant the platform broad, largely discretionary authority to remove content and to suspend or permanently ban accounts for violations ranging from explicit harms such as child exploitation to broadly defined categories such as 'misinformation' and 'harmful or dangerous acts', a scope that affects creators and ordinary users alike. Users under 16 face additional content restrictions and feature limitations, and users under 13 are placed in a separate, more restrictive experience to satisfy COPPA compliance obligations. Users can appeal content removals and account actions directly in the TikTok app by navigating to Settings, then Support, then Report a Problem.