Provision Registry

10,357 classified provisions across 277 platforms — browse, filter, and compare.

Every clause classified by type, severity, and platform. Updated as policies change.

Track specific clauses across platforms with provision-level alerts. Start Professional free trial
Meta · Meta Terms of Service
The terms set a minimum age of 13 and require parental consent for minors, but the agreement does not describe technical age verification mechanisms, which may limit practical enforcement and create exposure under COPPA.
CA-P-008673 · First tracked May 10, 2026 · Last seen May 11, 2026
Pinterest · Pinterest Terms of Service
This provision establishes Pinterest's stated COPPA compliance position; parents or guardians who discover a child under 13 has created an account should contact Pinterest to request account removal and data deletion.
CA-P-010904 · First tracked May 12, 2026 · Last seen May 12, 2026
Bumble · Bumble Terms and Conditions
The presence of active monitoring for underage users is a meaningful safety protection, though users should be aware that the method of detection is not fully specified.
CA-P-001188 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Bumble · Bumble Terms and Conditions
The age restriction and active monitoring for underage use engage COPPA and equivalent regulations, and the verification obligation creates data processing implications for users asked to confirm their age.
CA-P-007552 · First tracked May 9, 2026 · Last seen May 12, 2026
Telegram · Telegram Terms of Service
The minimum age of 18 in these jurisdictions is higher than the GDPR's default of 16 (or 13 in some member states), meaning Telegram applies a stricter standard, but enforcement depends entirely on user self-declaration.
CA-P-002902 · First tracked Apr 18, 2026 · Last seen Apr 18, 2026
Telegram · Telegram Terms of Service
This age restriction creates a clear eligibility requirement for users in the named jurisdictions and raises questions about how Telegram handles age verification and the data of minors who signed up before the rule took effect.
CA-P-007998 · First tracked May 10, 2026 · Last seen May 12, 2026
Medium · Age restriction
OpenAI · OpenAI Terms of Use
This provision establishes the minimum legal age for using OpenAI's AI products and creates parental consent obligations for minors, which affects families and any platform deploying OpenAI services to young users.
CA-P-003150 · First tracked Apr 27, 2026 · Last seen Apr 27, 2026
Medium · Acceptable use
Hugging Face · Hugging Face Terms of Service
The age 13 minimum triggers COPPA compliance obligations for Hugging Face, but does not protect teenagers aged 13-17 under GDPR Article 8 (which sets the age of digital consent at 16 in many EU member states) or under various state laws that provide additional protections for minors.
CA-P-001633 · First tracked Apr 3, 2026 · Last seen May 8, 2026
Medium · Age restriction
Headspace · Headspace Terms and Conditions
Setting the minimum age at 16 rather than 13 aligns with the GDPR's default age of digital consent (which several EU member states retain) and reduces COPPA compliance risk, but it also means the platform has no mechanism for verified parental consent for 13-15 year olds who may access the service.
CA-P-003551 · First tracked Apr 27, 2026 · Last seen Apr 27, 2026
Medium · Account control
Redfin · Redfin Terms of Use
This provision directly triggers COPPA compliance obligations, meaning Redfin must have technical and policy measures in place to prevent children under 13 from accessing the platform and creating accounts.
CA-P-001249 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Riot Games · Riot Games Terms of Service
This provision is significant for parents because it establishes that a parent or guardian must agree to the terms for minors to use the services, and it triggers obligations under children's privacy laws including COPPA and EU equivalents.
CA-P-007529 · First tracked May 9, 2026 · Last seen May 12, 2026
Google Gemini · Gemini Apps Privacy Notice
While the notice gives minors stronger protections, the prohibition on under-13 use may not be effectively enforced, creating legal exposure for Google and risk for children who access the service without adequate age verification.
CA-P-003714 · First tracked Apr 28, 2026 · Last seen Apr 28, 2026
Medium · Age restriction
Udemy · Udemy Terms of Use
Udemy's age verification relies on self-declaration, which may be insufficient to prevent children from accessing the platform and exposing their personal data without adequate parental consent.
CA-P-005450 · First tracked May 7, 2026 · Last seen May 7, 2026
Medium · Acceptable use
Amazon · Amazon Conditions of Use
Parents and guardians bear legal responsibility for any activity conducted by minors on Amazon accounts, including purchases and content submissions, and should actively supervise use.
CA-P-000237 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Medium · Privacy rights
Threads · Threads Privacy Policy
Parents should be aware that Threads relies primarily on user self-reporting for age verification, and that Meta's infrastructure connects Threads to Instagram, which has its own age-related data practices.
CA-P-008590 · First tracked May 10, 2026 · Last seen May 11, 2026
Medium · Privacy rights
Canva · Canva Terms of Use
The agreement places responsibility on users to self-certify their age and on parents to supervise minor users, rather than implementing verified age-gating mechanisms; this structure may not satisfy COPPA's verifiable parental consent requirements if Canva knowingly collects personal information from children under 13.
CA-P-010809 · First tracked May 11, 2026 · Last seen May 12, 2026
Ideogram · Ideogram Terms of Service
If a minor uses the platform and generates or shares content, their account may be terminated and data deleted when discovered, which could result in loss of access and generated content.
CA-P-010045 · First tracked May 11, 2026 · Last seen May 12, 2026
Chegg · Chegg Terms of Use
This provision is important for parents, as it indicates that Chegg's primary services are targeted at users 13 and older, and that parental consent obligations exist for certain uses by minors, though the verification mechanism is not detailed in the terms.
CA-P-008393 · First tracked May 10, 2026 · Last seen May 12, 2026
xAI · xAI Terms of Service
The combination of a 13-year minimum age and the disclosure that the service can generate adult-oriented content creates a notable risk for minor users, particularly if parental consent and monitoring are not actively implemented.
CA-P-009784 · First tracked May 10, 2026 · Last seen May 11, 2026
Replit · Replit Terms of Service
Parents should be aware that while Replit prohibits under-13 users, teens aged 13-17 can use the platform with parental consent, meaning parents may need to actively monitor their children's use of an AI-powered code deployment platform.
CA-P-004280 · First tracked Apr 30, 2026 · Last seen Apr 30, 2026
Replit · Replit Terms of Service
The agreement restricts platform access for minors and places legal agreement responsibility on parents or guardians for users aged 13-17, which has implications for school and educational use of the platform.
CA-P-011164 · First tracked May 12, 2026 · Last seen May 12, 2026
Chegg · Chegg Terms of Use
Parents should know that minors using Chegg are subject to the same terms, including arbitration waivers and data collection practices, and parental consent is required for users under 18.
CA-P-001818 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Peacock · Peacock Terms of Use
This provision is designed to comply with COPPA, which restricts the collection of personal information from children under 13 without verifiable parental consent. Parents who discover a child has an account should contact Peacock to have it removed.
CA-P-007508 · First tracked May 9, 2026 · Last seen May 12, 2026
Discord · Discord Terms of Service
If a minor under 13 uses Discord or a teenager uses it without parental consent, the user is in breach of the terms and Discord can terminate the account; parents should be aware that Discord is not designed or permitted for children under 13.
CA-P-007725 · First tracked May 9, 2026 · Last seen May 11, 2026
Medium · Indemnification
Google · Google Terms of Service
This provision governs whether minors can use Google services and places responsibility on parents or guardians to authorize use, which has data privacy implications under laws like COPPA for users under 13 in the US.
CA-P-001895 · First tracked Apr 4, 2026 · Last seen May 11, 2026
Spotify · Spotify Terms and Conditions
Parents who enable family plan access for minor children are taking on legal responsibility for those children's use of the service, including binding them to these Terms and the mandatory arbitration clause.
CA-P-000320 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Nintendo · Nintendo Terms of Use
Parents are legally responsible for their children's use of Nintendo's services and are bound by the terms they accept on their child's behalf, including the arbitration clause.
CA-P-000992 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Pika · Pika Terms of Service
The 13-year minimum age means teenagers may use the platform; parental consent obligations are stated but rely on user self-reporting, with no verification mechanism described, which creates compliance exposure under COPPA.
CA-P-007565 · First tracked May 9, 2026 · Last seen May 12, 2026
Medium · Account control
Microsoft · Microsoft Services Agreement (Legacy)
Parents should know that children who use Microsoft services without proper consent may have their accounts closed, and Microsoft collects data on all users including minors.
CA-P-000014 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026
Medium · Data collection
Riot Games · Riot Games Terms of Service
Parents are legally responsible for monitoring their children's use of Riot Games, including any purchases made, and Riot may collect data on minors subject to COPPA and similar regulations.
CA-P-001557 · First tracked Apr 3, 2026 · Last seen Apr 17, 2026

Professional Governance Intelligence

Monitor specific governance provisions across platforms.

Professional includes provision-level monitoring, regulatory mapping, and audit-ready analysis.

Start free · Start Professional free trial