Microsoft Copilot · Microsoft Copilot Terms of Service

Code of Conduct and Content Moderation

Medium severity

What it is

Microsoft can suspend or terminate your account if you violate its Code of Conduct, which prohibits a broad range of behaviors, from illegal activity to vaguely defined 'inappropriate content.'

Consumer impact (what this means for users)

Microsoft can terminate your access to all linked services, including Copilot, Xbox, OneDrive, and Outlook, for Code of Conduct violations. The definition of prohibited conduct is broad enough to encompass subjectively 'inappropriate' content without clear standards.


Why it matters (compliance & risk perspective)

The broad and vaguely defined content standards — particularly 'inappropriate content or material' — give Microsoft wide discretion to suspend or terminate accounts without clear notice of what conduct is prohibited.

View original clause language
When using the services, you must follow these rules: Don't do anything illegal. Don't engage in any activity that exploits, harms, or threatens to harm children. Don't send spam or engage in phishing. Don't publicly display or use the services to share inappropriate content or material. Don't engage in activity that is harmful to you, the services, or others. Don't infringe upon the rights of others. Don't reverse engineer or attempt to reverse engineer the services.

Institutional analysis (Compliance & legal intelligence)

REGULATORY FRAMEWORK: Content moderation terms implicate several regimes:
  • Section 230 of the Communications Decency Act (47 U.S.C. §230) provides Microsoft immunity for third-party content moderation decisions in the US.
  • In the EU, the Digital Services Act (DSA, Regulation 2022/2065) Arts. 14–18 impose transparency and appeal obligations for content removal decisions affecting users.
  • The EU's General Data Protection Regulation may also apply if content moderation involves processing personal data.
  • COPPA implications arise where content moderation applies to minors' accounts.
  • The UK Online Safety Act 2023 imposes content moderation obligations on services accessible to UK users.


Applicable agencies

  • FTC
    The FTC has authority to scrutinize content moderation practices that constitute unfair or deceptive trade practices under Section 5 of the FTC Act, including opaque account termination procedures.

Provision details

Document information
Document
Microsoft Copilot Terms of Service
Entity
Microsoft Copilot
Document last updated
April 29, 2026
Tracking information
First tracked
April 27, 2026
Last verified
April 27, 2026
Record ID
CA-P-003187
Document ID
CA-D-00017
Evidence Provenance
Source URL
Wayback Machine
SHA-256
3b836ca98040eca1ec3cd4dd56364c9cc3085ac3f2dd8aea54de71e50c847a66
Verified
✓ Snapshot stored   ✓ Change verified
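The published SHA-256 digest lets a reader independently confirm that a locally saved copy of the archived clause matches the stored snapshot. A minimal sketch of that check, assuming a hypothetical local file name `snapshot.html` (not part of this record):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Published digest for record CA-P-003187 (from the Evidence Provenance above)
EXPECTED = "3b836ca98040eca1ec3cd4dd56364c9cc3085ac3f2dd8aea54de71e50c847a66"

# Compare against a local copy of the snapshot (hypothetical filename):
# if sha256_of("snapshot.html") == EXPECTED:
#     print("Snapshot matches the archived record")
```

A mismatch means only that the bytes differ, not which copy is authoritative; the archived snapshot and its capture timestamp remain the reference.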
How to Cite
ConductAtlas Policy Archive
Entity: Microsoft Copilot | Document: Microsoft Copilot Terms of Service | Record: CA-P-003187
Captured: 2026-04-27 09:50:27 UTC | SHA-256: 3b836ca98040eca1…
URL: https://conductatlas.com/platform/microsoft-copilot/microsoft-copilot-terms-of-service/code-of-conduct-and-content-moderation/
Accessed: May 2, 2026
Classification
Severity
Medium