The 13-year minimum is the baseline required by US federal law (COPPA), but because the platform relies on user self-declaration rather than technical age verification, underage use remains possible, and a child's data could be collected under practices the parents never consented to.
Yelp
· Yelp Terms of Service
The age restriction reflects Yelp's compliance posture under the Children's Online Privacy Protection Act (COPPA), which restricts data collection from children under 13 without verifiable parental consent.
If a minor under 13 creates an account, their data may already have been collected by the time DoorDash becomes aware, and the burden of identifying and reporting the underage account falls on parents.
While this clause provides basic COPPA compliance, it places the burden on OpenSea's own discovery rather than requiring active age verification, leaving a potential compliance gap.
Ticketmaster does not independently verify users' ages, meaning minors can access the platform and enter into financial transactions without parental knowledge, which creates COPPA compliance risk.
Medium
· Medium Privacy Policy
Parents and guardians should be aware that Medium does not have mechanisms to verify user age at sign-up, which means the platform relies on users to self-report compliance with the age restriction.
Children under 13 who use Grammarly, including for school assignments, do so outside the service's terms, and any data collected from them could be subject to regulatory action.
Parents should be aware that the platform is not designed for children, and if a minor under 13 has used a parent's account or created their own, their data may have been collected in violation of COPPA.
Figma
· Figma Terms of Service
Parents and educators should be aware that Figma is not legally available to children under 13 (or 16 in the EU), and any use by minors below these ages violates the ToS and may result in account termination and deletion of the minor's data.
The prohibition on under-13 users is legally required by COPPA, but without verified age-gating mechanisms, the restriction may be unenforceable in practice — and parents who permit their minor children to use the service take on full legal responsibility for that use.
The age eligibility requirement is legally significant because investment accounts for minors typically require custodial account structures subject to specific regulatory requirements, yet the terms rely on a blanket self-certification of eligibility rather than any verification.
Venmo
· Venmo User Agreement
The agreement establishes that Venmo is not authorized for use by minors, and that users represent their own eligibility; Venmo does not assume responsibility for verifying user age at registration beyond this self-representation.
These age restrictions are legally significant because COPPA imposes specific consent and data protection requirements for users under 13, and the under-18 requirement creates compliance obligations around how Peloton handles accounts where users may be minors.
Peacock collects personal data from account holders, and age gating is required by law to protect minors from data collection and inappropriate content exposure.
The Kids & Teens product involves collecting personal and financial data about minors, which triggers heightened legal protections under federal and state law, including COPPA for children under 13.
LinkedIn does not actively verify user ages, meaning minors could potentially access the service, and the legal protection for their data depends on users being honest about their age at registration.
This provision establishes 18 as the minimum age for account creation while creating a parental consent pathway for minors, which has implications for how genetic data of children is collected and processed under applicable privacy laws.
Cerebras allows users as young as 13 to access AI inference and training services, raising child safety and data protection compliance questions, particularly in jurisdictions with higher digital consent ages.
Parents should be aware that Pinterest is not designed for young children and that minors under 13 are not permitted to register; accounts discovered to belong to underage users may be removed along with their data.
Meta
· Meta Terms of Service
If you are a parent, this clause means Meta is not supposed to allow children under 13 to create accounts, though enforcement of this restriction relies primarily on user self-reporting of age.
The 18-year minimum age requirement, which is higher than the digital consent age in many jurisdictions, excludes a significant population of potential learners and raises compliance questions around age verification and minors using a parent's account.
The Kids and Teens product extends Revolut's services to minors, creating specific regulatory obligations around data collection and protection for users under 13 under COPPA, and under 18 more broadly.
If a child under 13 uses Poshmark without proper verification, the platform collects their data in potential violation of COPPA, and the parent may have limited recourse to request deletion of that data.
Microsoft
· Microsoft Services Agreement (Legacy)
This provision establishes Microsoft's stated compliance with COPPA's minimum age threshold by prohibiting account creation for users under 13, but the enforcement mechanism and verification process are not detailed in this provision.
Minors aged 16-17 can open Revolut accounts with restrictions, which means additional safeguards apply to protect younger users from financial harm; these users and their parents should understand the specific limitations before opening an account.
The platform relies on user self-representation for age verification, which creates risk for both minors who access the platform and for Tinder's compliance with laws protecting children online.
Minors are prohibited from using Claude.ai and Claude Pro, and parents or guardians should be aware that the services are not designed or permitted for children.
Kick
· Kick Terms of Service
The age restriction and parental consent requirement for minors implicate COPPA in the US and similar frameworks in the EU and UK. Parents should be aware that minors using the platform with their consent are subject to the platform's full data collection and content exposure practices.
OpenAI
· OpenAI Business Terms
The agreement places responsibility on parents or legal guardians for minors' use of the services, which has direct implications for parental liability and the attribution of contractual obligations arising from a minor's account activity.
Given Whatnot's focus on collectibles, trading cards, and pop culture items that strongly appeal to teenagers and younger audiences, there is a meaningful risk of underage users accessing the platform, creating compliance exposure.