Provision Registry

3,422 classified provisions across 277 platforms. Browse, filter, and compare.

Every clause classified by type, severity, and platform. Updated as policies change.

Steam · Steam Subscriber Agreement
The minimum age of 13 reflects US COPPA requirements, but many jurisdictions require a higher age of digital consent: under the GDPR the default is 16 (member states may lower it to 13, and users below the threshold need parental consent), while the UK requires a minimum of 13 with additional safeguards under the Age Appropriate Design Code.
CA-P-006080 · First tracked May 8, 2026 · Last seen May 8, 2026
Discord · Discord Terms of Service
The agreement establishes a minimum age of 13 and requires parental consent for users aged 13 to 17, triggering COPPA compliance obligations for US users and similar frameworks in other jurisdictions.
CA-P-011343 · First tracked May 12, 2026 · Last seen May 12, 2026
Zoom · Zoom Terms of Service
This restriction is important for parents and educators to understand — Zoom's standard consumer service is not designed or approved for children under 16, which has implications for school and family use.
CA-P-006360 · First tracked May 8, 2026 · Last seen May 8, 2026
Twitch · Twitch Terms of Service
This age restriction is legally required under COPPA, but the enforcement mechanism relies on user self-certification rather than verified age checks, which may leave children inadequately protected.
CA-P-002791 · First tracked Apr 18, 2026 · Last seen Apr 18, 2026
Peloton · Peloton Terms of Service
Without active age verification, the COPPA prohibition on collecting data from under-13 users depends entirely on self-reporting, which is a commonly exploited gap that has resulted in FTC enforcement actions against fitness and social platforms.
CA-P-003559 · First tracked Apr 27, 2026 · Last seen Apr 27, 2026
ClickUp · ClickUp Terms of Use
COPPA creates strict legal obligations around children's data, and any organization deploying ClickUp in educational or youth-facing contexts must ensure underage users are not accessing the platform without proper controls.
CA-P-005960 · First tracked May 8, 2026 · Last seen May 8, 2026
Reddit · Reddit User Agreement
If a child under 13 uses Reddit, the platform is not legally authorized to collect their data under COPPA, but the ToS places the enforcement burden on users and parents rather than requiring Reddit to implement robust age verification.
CA-P-000719 · First tracked Apr 3, 2026 · Last seen Apr 10, 2026
High · Age restriction
Kick · Kick Terms of Service
Kick's age verification relies entirely on self-attestation, meaning minors can access the platform and its potentially mature content without any meaningful gatekeeping, creating both legal and child safety risk.
CA-P-006704 · First tracked May 8, 2026 · Last seen May 8, 2026
High · Age restriction
Stability AI · Stability AI Terms of Service
Given that Stability AI's tools can generate adult or potentially harmful content, the 18+ age restriction is a legal safety measure — but if minors access the service, both legal liability and child safety risks arise.
CA-P-003722 · First tracked Apr 28, 2026 · Last seen Apr 28, 2026
High · Age restriction
Grindr · Grindr Terms of Service
Given the sensitive sexual nature of content on Grindr, the presence of underage users creates severe safety risks, and the absence of robust age verification beyond self-declaration creates legal and reputational exposure.
CA-P-004773 · First tracked May 7, 2026 · Last seen May 7, 2026
High · Privacy rights
TikTok · TikTok Community Guidelines
Minors on TikTok are entitled to heightened privacy and safety protections under COPPA and the EU's GDPR/DSA framework, but TikTok has faced repeated regulatory findings that its age verification and data handling for minors are inadequate.
CA-P-001853 · First tracked Apr 3, 2026 · Last seen Apr 3, 2026
High · Acceptable use
Meta · Meta Terms of Service
Meta's age restriction is a legal minimum required by COPPA, but the absence of robust age verification means children under 13 frequently access the platform — creating significant regulatory exposure for Meta and safety risks for minors whose data may be collected.
CA-P-001931 · First tracked Apr 4, 2026 · Last seen Apr 9, 2026
OpenAI · OpenAI EU Terms of Use
GDPR Article 8 sets the digital consent age at 16 by default, though member states may lower it to a minimum of 13; users below the applicable threshold require verifiable parental or guardian consent, and platforms must take reasonable steps to verify age.
CA-P-011050 · First tracked May 12, 2026 · Last seen May 12, 2026
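The GDPR Article 8 rule described above (a default consent age of 16, which member states may lower to 13, with parental consent required below the threshold) can be sketched as a simple jurisdiction lookup. The table and the `needs_parental_consent` helper below are illustrative assumptions for this sketch, not any platform's actual API, and a real compliance system would need an authoritative, maintained source of per-jurisdiction thresholds.

```python
# Sketch: jurisdiction-aware digital-consent-age check.
# The thresholds below are illustrative assumptions, not legal advice.
CONSENT_AGE = {
    "DE": 16,  # keeps the GDPR Article 8 default
    "FR": 15,  # lowered by national law
    "UK": 13,  # UK GDPR threshold
    "US": 13,  # COPPA threshold
}
DEFAULT_CONSENT_AGE = 16  # Article 8 default for unlisted EU member states


def needs_parental_consent(age: int, country: str) -> bool:
    """Return True if a user below the jurisdiction's digital-consent
    age would require verifiable parental or guardian consent."""
    threshold = CONSENT_AGE.get(country, DEFAULT_CONSENT_AGE)
    return age < threshold


print(needs_parental_consent(14, "DE"))  # True: below Germany's 16
print(needs_parental_consent(14, "UK"))  # False: at or above the UK's 13
```

The point of the sketch is that a single global minimum age (such as 13) does not satisfy every jurisdiction: the same 14-year-old clears the UK threshold but not the German one.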
Meta · Meta Terms of Service
Despite the 13-year minimum age requirement, Meta has faced extensive regulatory scrutiny for collecting and profiling data about minors, and the Terms place significant responsibility on parents rather than on Meta's enforcement mechanisms.
CA-P-000179 · First tracked Apr 3, 2026 · Last seen Apr 3, 2026
BeReal · BeReal Terms of Service
Age restrictions determine whether minors can legally use the service and what additional protections apply to their data, which is particularly significant given BeReal's popularity among teenagers.
CA-P-008451 · First tracked May 10, 2026 · Last seen May 12, 2026
Suno · Suno Terms of Service
The 13-year minimum age threshold triggers COPPA compliance obligations, but the Terms rely on self-certification rather than verified parental consent, which is a known enforcement risk area.
CA-P-004422 · First tracked Apr 30, 2026 · Last seen Apr 30, 2026
OpenAI · OpenAI Terms of Use
If a child under 13 uses ChatGPT without authorization, their data may be collected in violation of COPPA, creating legal exposure for both OpenAI and any platform operator that facilitated access; parents should verify their children are not using these services unsupervised.
CA-P-001997 · First tracked Apr 4, 2026 · Last seen Apr 9, 2026
Discord · Discord Terms of Service
Discord does not describe any technical age verification mechanism in its Terms, which means the age restriction relies on self-reporting, creating risk for minors who may access the platform without parental knowledge or consent.
CA-P-004886 · First tracked May 7, 2026 · Last seen May 7, 2026
OpenAI · Terms of Use (ROW)
OpenAI's age restriction mechanism relies on user self-attestation without robust verification, which creates COPPA compliance risk and may expose minors to AI-generated content without adequate parental oversight.
CA-P-002444 · First tracked Apr 9, 2026 · Last seen Apr 10, 2026
Duolingo · Duolingo Terms of Service
Duolingo's platform is widely used by minors, including through Duolingo for Schools; the adequacy of parental consent mechanisms and age verification is a significant compliance consideration under COPPA and equivalent laws.
CA-P-009541 · First tracked May 10, 2026 · Last seen May 12, 2026
Runway · Runway Terms of Service
Runway permits users as young as 13, the COPPA minimum, and its irrevocable AI training data license applies to minors' Inputs and Outputs, raising significant child privacy concerns.
CA-P-004090 · First tracked Apr 30, 2026 · Last seen Apr 30, 2026
Snapchat · Snap Terms of Service
Snapchat's age verification relies on self-reported age rather than technical verification, which means children under 13 may access the platform without parental knowledge, and COPPA protections may not be effectively enforced.
CA-P-000738 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Google · Google Terms of Service
Google's services collect significant amounts of personal data, and COPPA requires special protections for children under 13 — if a child uses Google without proper parental consent, both the child and the parent may unknowingly expose sensitive data without the legal protections that should apply.
CA-P-003173 · First tracked Apr 27, 2026 · Last seen Apr 27, 2026
Microsoft Copilot · Microsoft Copilot Terms of Service
This age restriction is legally significant under COPPA and affects how families set up Microsoft accounts for children: if a child under 13 uses the services without proper parental consent setup, Microsoft's terms disclaim liability for any resulting data collection or harms.
CA-P-002082 · First tracked Apr 4, 2026 · Last seen Apr 9, 2026
Spotify · Spotify Terms and Conditions
Spotify's self-attestation model for age and parental consent creates compliance risk under COPPA: the law requires verifiable parental consent before collecting data from children under 13 and imposes specific obligations on data collected from minors, yet the platform's ability to verify these representations is limited.
CA-P-002602 · First tracked Apr 9, 2026 · Last seen Apr 10, 2026
Google · Google Terms of Service
If a child under 13 uses Google services without parental consent, Google may collect data from them in violation of COPPA, creating legal risk for Google and leaving children's data without the protections the law requires.
CA-P-000127 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Strava · Strava Terms of Service
Parents who allow underage children to use Strava are accepting full legal responsibility for all Terms violations by those children, including any data privacy implications of minors' GPS and fitness data being processed.
CA-P-006395 · First tracked May 8, 2026 · Last seen May 8, 2026
High · Acceptable use
Grindr · Grindr Terms of Service
Grindr relies on self-attestation rather than independent age verification, meaning minors may access the platform despite the prohibition — a safety risk given the platform's adult content.
CA-P-001403 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
23andMe · 23andMe Terms of Service
Genetic testing of minors raises significant ethical and privacy concerns, as DNA results are permanent and irrevocable, and children cannot meaningfully consent to having their genetic information collected and potentially used for research.
CA-P-000895 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Snapchat · Snap Terms of Service
COPPA requires verifiable parental consent before collecting personal data from children under 13; Snap's self-attestation age verification model has been the subject of prior regulatory scrutiny and may not constitute adequate compliance.
CA-P-003987 · First tracked Apr 28, 2026 · Last seen Apr 28, 2026

Professional Governance Intelligence

Monitor specific governance provisions across platforms.

Professional includes provision-level monitoring, regulatory mapping, and audit-ready analysis.

Start free · Start Professional free trial