Microsoft services are not intended for children below the minimum age required by local law, and Microsoft states it will delete personal data collected from underage children if discovered.
This analysis describes what Microsoft Copilot's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
Parents and guardians should be aware that children below the applicable age threshold (typically 13 in the US under COPPA) should not create or use Microsoft accounts without proper parental consent mechanisms.
Interpretive note: The agreement references compliance with local law without specifying implementation details, creating uncertainty about how age verification or parental consent is operationally enforced across different jurisdictions.
Minors below the applicable age threshold who use Microsoft services including Copilot without proper parental consent may have their personal data collected in a manner that raises legal questions under COPPA and equivalent national laws.
How other platforms handle this
Replit is not directed to children under the age of 13. If you are under 13 years of age, you are not permitted to use the Services. If we learn that we have collected Personal Information from a child under age 13, we will take steps to delete such information from our files as soon as possible.
Our online services are not directed to children under the age of 13, and we do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13, we will delete that information as quickly as possible.
Our Services are not directed to children under the age of 13. We do not knowingly collect personal information from children under 13. If we learn that we have collected personal information from a child under 13 without parental consent, we will take steps to delete such information. In some juris...
Monitoring
Microsoft Copilot has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"Children and Microsoft Services. Microsoft's services are not directed to children whose age makes it illegal to process their personal data or who otherwise require parental or guardian consent for processing consistent with local law. If we learn that we have collected personal data from a child who is under the required age, we will delete that personal data." — Excerpt from Microsoft Copilot's Terms of Service
(1) REGULATORY LANDSCAPE: COPPA applies to online services directed to children under 13 in the US and requires verifiable parental consent before collecting personal data from such children. The FTC is the primary COPPA enforcement authority. EU GDPR Article 8 sets the age of digital consent at 16, though member states may lower it to 13; services accessible to children in the EU must comply with applicable national age thresholds. The UK's Age Appropriate Design Code (Children's Code) imposes additional obligations for services likely to be accessed by children.
(2) GOVERNANCE EXPOSURE: Medium. The agreement's approach of prohibiting underage use and committing to delete unlawfully collected data is a common compliance posture, but it does not detail the technical or procedural safeguards in place to prevent or detect underage access, particularly for AI-powered services like Copilot.
(3) JURISDICTION FLAGS: EU member states have varying digital consent ages between 13 and 16. The UK Children's Code applies to services likely to be accessed by under-18s and imposes design and data minimization obligations. Illinois BIPA could apply if any covered services collect biometric data from minors.
(4) CONTRACT AND VENDOR IMPLICATIONS: Organizations deploying Microsoft consumer services in educational or family contexts should assess whether the age restriction and parental consent mechanisms are adequate for their specific deployment and user base.
(5) COMPLIANCE CONSIDERATIONS: Compliance teams should assess whether Microsoft's stated data deletion commitment is operationally implemented and whether parental consent flows are in place for services like Copilot that may be accessed by younger users. A data protection impact assessment may be warranted for AI services accessible to minors.
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Free: track 1 platform + weekly digest. Watcher: 10 platforms + same-day alerts. No credit card required.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
ConductAtlas has identified this type of provision across 5 platforms. See the full comparison.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Microsoft Copilot.