Steam
· Steam Subscriber Agreement
The agreement establishes a minimum age of 13 and requires parental consent for users aged 13 to 17, engaging COPPA compliance obligations for US users and similar frameworks in other jurisdictions.
The 13-year floor reflects US COPPA requirements, but ages of digital consent vary elsewhere: under GDPR Article 8 the EU default is 16 (member states may lower it to no less than 13, and users below the applicable threshold need parental consent), while the UK sets the threshold at 13 and adds safeguards under the Age Appropriate Design Code.
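The tiering above (no accounts under 13, parental consent from 13 to 17, full capacity at 18) can be sketched as a simple gate. This is a minimal illustration of the rule as described, not anything from Steam's actual systems; the function name and return labels are invented for the example.

```python
def steam_access_tier(age: int, has_parental_consent: bool) -> str:
    """Illustrative sketch of the Steam Subscriber Agreement's age tiers.

    Under 13: no account permitted (the COPPA floor).
    13-17: permitted only with parental consent.
    18+: may agree to the terms directly.
    """
    if age < 13:
        return "blocked"
    if age < 18:
        return "allowed" if has_parental_consent else "needs_parental_consent"
    return "allowed"
```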
Zoom
· Zoom Terms of Service
This restriction is important for parents and educators to understand — Zoom's standard consumer service is not designed or approved for children under 16, which has implications for school and family use.
Twitch
· Twitch Terms of Service
This age restriction is legally required under COPPA, but the enforcement mechanism relies on user self-certification rather than verified age checks, which may leave children inadequately protected.
Without active age verification, the COPPA prohibition on collecting data from under-13 users depends entirely on self-reporting, which is a commonly exploited gap that has resulted in FTC enforcement actions against fitness and social platforms.
ClickUp
· ClickUp Terms of Use
COPPA creates strict legal obligations around children's data, and any organization deploying ClickUp in educational or youth-facing contexts must ensure underage users are not accessing the platform without proper controls.
Reddit
· Reddit User Agreement
If a child under 13 uses Reddit, the platform is not legally authorized to collect their data under COPPA, but the ToS places enforcement burden on users and parents rather than Reddit implementing robust age verification.
Kick
· Kick Terms of Service
Kick's age verification relies entirely on self-attestation, meaning minors can access the platform and its potentially mature content without any meaningful gatekeeping, creating both legal and child safety risk.
Stability AI
· Stability AI Terms of Service
Given that Stability AI's tools can generate adult or potentially harmful content, the 18+ age restriction is a legal safety measure — but if minors access the service, both legal liability and child safety risks arise.
Grindr
· Grindr Terms of Service
Given the sensitive sexual nature of content on Grindr, the presence of underage users creates severe safety risks, and the absence of robust age verification beyond self-declaration creates legal and reputational exposure.
TikTok
· TikTok Community Guidelines
Minors on TikTok are entitled to heightened privacy and safety protections under COPPA and the EU's GDPR/DSA framework, but TikTok has faced repeated regulatory findings that its age verification and data handling for minors are inadequate.
Meta
· Meta Terms of Service
Meta's age restriction is a legal minimum required by COPPA, but the absence of robust age verification means children under 13 frequently access the platform — creating significant regulatory exposure for Meta and safety risks for minors whose data may be collected.
OpenAI
· OpenAI EU Terms of Use
GDPR Article 8 sets the digital consent age at 16 by default, though member states may lower it to a minimum of 13; users below the applicable threshold require verifiable parental or guardian consent, and platforms must take reasonable steps to verify age.
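Article 8's structure, a 16-year default that member states may lower to no less than 13, with parental consent required below the applicable threshold, can be sketched as a simple lookup. The override table below includes a couple of real examples (France at 15, Denmark at 13) but is illustrative, not an authoritative register of member-state thresholds.

```python
# Sketch of GDPR Article 8 age-of-consent logic. The per-state
# overrides are illustrative examples only; consult each member
# state's implementing law for authoritative values.
DEFAULT_DIGITAL_CONSENT_AGE = 16  # GDPR Art. 8(1) default
MEMBER_STATE_OVERRIDES = {
    # Member states may lower the threshold, but never below 13.
    "FR": 15,  # France (example)
    "DK": 13,  # Denmark (example)
}

def needs_parental_consent(country: str, age: int) -> bool:
    """Return True if a user below the applicable threshold requires
    verifiable parental or guardian consent."""
    threshold = MEMBER_STATE_OVERRIDES.get(country, DEFAULT_DIGITAL_CONSENT_AGE)
    assert threshold >= 13, "Art. 8 forbids thresholds below 13"
    return age < threshold
```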
Meta
· Meta Terms of Service
Despite the 13-year minimum age requirement, Meta has faced extensive regulatory scrutiny for collecting and profiling data about minors, and the Terms place significant responsibility on parents rather than on Meta's enforcement mechanisms.
BeReal
· BeReal Terms of Service
Age restrictions determine whether minors can legally use the service and what additional protections apply to their data, which is particularly significant given BeReal's popularity among teenagers.
Suno
· Suno Terms of Service
The 13-year minimum age threshold triggers COPPA compliance obligations, but the Terms rely on self-certification rather than verified parental consent, which is a known enforcement risk area.
OpenAI
· OpenAI Terms of Use
If a child under 13 uses ChatGPT without authorization, their data may be collected in violation of COPPA, creating legal exposure for both OpenAI and any platform operator that facilitated access; parents should verify their children are not using these services unsupervised.
Discord
· Discord Terms of Service
Discord does not describe any technical age verification mechanism in its Terms, which means the age restriction relies on self-reporting, creating risk for minors who may access the platform without parental knowledge or consent.
OpenAI
· OpenAI Terms of Use
OpenAI's age restriction mechanism relies on user self-attestation without robust verification, which creates COPPA compliance risk and may expose minors to AI-generated content without adequate parental oversight.
Duolingo
· Duolingo Terms and Conditions
Duolingo's platform is widely used by minors, including through Duolingo for Schools; the adequacy of parental consent mechanisms and age verification is a significant compliance consideration under COPPA and equivalent laws.
Runway
· Runway Terms of Service
Runway permits users as young as 13, the COPPA floor, which means its irrevocable AI training data license applies to minors' Inputs and Outputs, raising significant child privacy concerns.
Snap
· Snap Terms of Service
Snapchat's age verification relies on self-reported age rather than technical verification, which means children under 13 may access the platform without parental knowledge, and COPPA protections may not be effectively enforced.
Google
· Google Terms of Service
Google's services collect significant amounts of personal data, and COPPA requires special protections for children under 13 — if a child uses Google without proper parental consent, both the child and the parent may unknowingly expose sensitive data without the legal protections that should apply.
Microsoft
· Microsoft Services Agreement
This age restriction is legally significant under COPPA and affects how families set up Microsoft accounts for children: if a child under 13 uses the services without a proper parental consent setup, Microsoft disclaims liability for any resulting data collection or harms.
Spotify
· Spotify Terms and Conditions
Spotify's self-attestation model for age and parental consent creates compliance risk for underage users: COPPA requires verifiable parental consent before collecting data from children under 13 and imposes specific obligations regarding data collected from minors, yet the platform's ability to verify these representations is limited.
Google
· Google Terms of Service
If a child under 13 uses Google services without parental consent, Google may collect data from them in violation of COPPA, creating legal risk for Google and leaving children's data without the protections the law requires.
Strava
· Strava Terms of Service
Parents who allow underage children to use Strava are accepting full legal responsibility for all Terms violations by those children, including any data privacy implications of minors' GPS and fitness data being processed.
Grindr
· Grindr Terms of Service
Grindr relies on self-attestation rather than independent age verification, meaning minors may access the platform despite the prohibition — a safety risk given the platform's adult content.
Genetic testing of minors raises significant ethical and privacy concerns, as DNA results are permanent and irrevocable, and children cannot meaningfully consent to having their genetic information collected and potentially used for research.
Snap
· Snap Terms of Service
COPPA requires verifiable parental consent before collecting personal data from children under 13; Snap's self-attestation age verification model has been the subject of prior regulatory scrutiny and may not constitute adequate compliance.