YouTube bans content supporting terrorist or criminal organizations and works with other major tech companies through the Global Internet Forum to Counter Terrorism (GIFCT) to identify and remove such content across the internet.
Creators who discuss extremism, terrorism, or criminal organizations — even in an analytical or journalistic context — risk removal based on government designation lists and cross-platform coordination, with limited transparency about the specific criteria applied.
YouTube's reliance on government designations and cross-platform GIFCT coordination means that content moderation decisions on extremism can be influenced by geopolitical factors and can result in coordinated removal across multiple platforms simultaneously.
REGULATORY FRAMEWORK: This provision implicates the EU Terrorist Content Online Regulation (TCO Regulation, EU 2021/784) requiring hosting service providers to remove terrorist content within one hour of a competent authority removal order; EU DSA Article 41 regarding cooperation with law enforcement; US counter-terrorism statutes including 18 U.S.C. §2339B (material support for terrorism); and OFAC sanctions compliance obligations where designated organizations are involved. Primary enforcement authorities: European Commission, national competent authorities under TCO Regulation, US DOJ, and OFAC.