Children under 13 cannot use Microsoft services including Copilot, and teenagers between 13 and 18 need parental permission.
Parents should be aware that Microsoft's AI services, including Copilot, are not designed for children under 13 and that teens require parental approval. The document does not, however, detail what technical age-verification mechanisms are in place to enforce these restrictions.
Copilot is an AI system capable of generating any type of content, and the adequacy of age verification and parental consent mechanisms directly affects child safety and COPPA compliance.
REGULATORY FRAMEWORK: This provision directly implicates COPPA (Children's Online Privacy Protection Act, 15 U.S.C. §6501 et seq.; 16 CFR Part 312), enforced by the FTC, which requires verifiable parental consent before collecting personal information from children under 13. It also engages the EU's GDPR Art. 8 (age of digital consent varies by member state, 13–16 years), UK's Children's Code (ICO Age Appropriate Design Code, effective 2021), and the EU Digital Services Act Art. 28 (prohibition on processing minors' data for targeted advertising). For Copilot specifically, the EU AI Act's risk classification framework may impose additional obligations where AI systems are accessible to minors.