X can remove your content, restrict its visibility, suspend your account, or take legal action if you violate the terms; in the EU and UK, X must also restrict content that is 'harmful' or 'unsafe' under local law, even if it doesn't violate X's own rules.
This analysis describes what X's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
The terms reserve broad enforcement discretion, including legal action, and acknowledge that EU and UK regulatory obligations require X to restrict categories of content beyond its own policy violations. As a result, users in those jurisdictions may face content or account restrictions regardless of whether X's own rules were broken.
Users whose content is restricted or removed in the EU or UK may be subject to restrictions required by local law (the DSA or the Online Safety Act) rather than by X's own rules, with redress available through X's internal complaints process or out-of-court dispute settlement. For US and other non-EU/UK users, enforcement actions are taken at X's discretion without an equivalent statutory redress framework.
How other platforms handle this
While the categories of Restricted Content above provide a clear framework, we may also moderate other types of Content in response to evolving challenges posed by advancements in Machine Learning. As we assess such Content, we hold consent as a core value, ensuring our approach remains thoughtful, ...
Mistral AI may monitor use of the Mistral AI Products through automated means in accordance with the Usage Policy. This monitoring is conducted to ensure compliance with Mistral AI's terms and policies, and to maintain the security and integrity of Mistral AI Products. We reserve the right to review...
Subject to these Terms, Ideogram hereby assigns to you all right, title, and interest in Outputs generated by you using the Services. To the extent Outputs include or are based on another user's Content (e.g., where you and another user enter similar or identical prompts), you acknowledge that Ideog...
Monitoring
X has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"X reserves the right to take enforcement actions against you if you do violate these terms, such as, for example, removing your Content, limiting visibility, discontinuing your access to X, or taking legal action. Certain jurisdictions, including the European Union and the United Kingdom, also impose obligations on X to enforce against not only illegal content but also categories of content deemed by law to be 'harmful' or 'unsafe.' As a result, your Content or account may be subject to restrictions in those jurisdictions.— Excerpt from X's X Terms of Service
REGULATORY LANDSCAPE: This provision engages the EU Digital Services Act (Regulation (EU) 2022/2065), particularly the articles governing content moderation obligations for very large online platforms, the UK Online Safety Act 2023, and X's obligations as a designated VLOP (Very Large Online Platform) under the DSA. The European Commission and Ofcom are the relevant enforcement authorities respectively.

GOVERNANCE EXPOSURE: High for EU/UK operations. X's acknowledgment of DSA and OSA obligations confirms platform obligations to enforce against 'harmful' or 'unsafe' content categories as defined by law, with associated transparency, due process, and redress requirements. Non-compliance with DSA content moderation obligations can result in fines of up to 6% of global annual turnover.

JURISDICTION FLAGS: EU and UK users have explicit statutory redress rights referenced in the document (DSA out-of-court dispute settlement, OSA complaints process). US users do not have equivalent statutory redress mechanisms referenced in these terms. The geographic scope of 'harmful content' obligations varies by jurisdiction and is subject to evolving regulatory guidance.

CONTRACT AND VENDOR IMPLICATIONS: Organizations that rely on X for communications or content distribution in the EU/UK should be aware that content may be restricted under DSA/OSA obligations independent of X's own content policies. Brand safety assessments should account for regulatory content moderation requirements in these jurisdictions.

COMPLIANCE CONSIDERATIONS: Legal and compliance teams should review X's DSA transparency reports and complaint statistics to assess enforcement patterns. EU-based organizations with content moderation programs that intersect with X should evaluate DSA Article 17 statement of reasons requirements and appeals procedures.
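One way to act on the enforcement-pattern review suggested above is to tally how often X's moderation decisions cite a legal obligation rather than its own policies. The sketch below is a minimal illustration, not any official tooling: it assumes a CSV export of statements of reasons (for example, a download from the DSA transparency database) with columns named platform_name, decision_ground, and category; those column names and the file layout are assumptions to verify against the real export before use.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: tally enforcement grounds for one platform from a
statements-of-reasons CSV export. Column names are illustrative assumptions,
not a confirmed schema; check them against the header row of your export."""

import csv
import sys
from collections import Counter

# Assumed column names (hypothetical -- verify against the actual export).
PLATFORM_COL = "platform_name"
GROUND_COL = "decision_ground"   # legal obligation vs. terms-of-service breach
CATEGORY_COL = "category"        # content category cited in the statement of reasons


def summarize(path, platform="X"):
    """Count decision grounds and content categories attributed to `platform`."""
    grounds = Counter()
    categories = Counter()

    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if row.get(PLATFORM_COL, "").strip().lower() != platform.lower():
                continue
            grounds[row.get(GROUND_COL, "unknown")] += 1
            categories[row.get(CATEGORY_COL, "unknown")] += 1

    total = sum(grounds.values())
    if total == 0:
        print(f"No statements of reasons found for {platform}")
        return

    print(f"{total} statements of reasons attributed to {platform}\n")
    print("Decision ground (legal obligation vs. platform policy):")
    for ground, n in grounds.most_common():
        print(f"  {ground}: {n} ({n / total:.1%})")
    print("\nTop content categories:")
    for cat, n in categories.most_common(10):
        print(f"  {cat}: {n}")


if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "statements_of_reasons.csv")
```

The split reported for the decision-ground column is the figure most relevant to the DSA/OSA distinction discussed in this analysis: it separates restrictions X attributes to legal obligations from those it attributes to its own rules.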
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
Is ConductAtlas affiliated with X? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by X.