By using Facebook, you agree to follow Meta's Community Standards — a separate document that defines what content is and isn't allowed — and Meta can take action against your account if it decides you've violated those standards.
Because the Community Standards are maintained separately and Meta can update them unilaterally, the rules governing what you can post may change at any time without any corresponding change to the main Terms you agreed to.
The Terms incorporate the Community Standards by reference: the rules governing what you can post and say on Facebook are set out in a separate, frequently updated document that Meta can change at any time, and violations can lead to content removal or account suspension.
REGULATORY FRAMEWORK: EU DSA Arts. 14, 15, 16, and 17 impose obligations on online platforms — including Very Large Online Platforms such as Facebook — regarding content moderation transparency, notice-and-action procedures, and statements of reasons for content removal, all of which are implicated by this provision. The First Amendment (US) does not apply to private platform content moderation. The EU Terrorism Content Regulation (2021/784) and the Audiovisual Media Services Directive impose specific content moderation obligations. FTC Act Section 5 applies to deceptive representations about content moderation consistency.