Found in 50 of 170 platforms tracked (29% adoption) · 82 provisions
Genetic testing of minors raises significant ethical and privacy concerns, as DNA results are permanent and irrevocable, and children cannot meaningfully consent to having their genetic information collected and analyzed.
Using Adobe tools to handle sensitive personal data in violation of this clause could expose you or your organization to legal liability and result in account termination.
This clause prevents misuse of AWS's powerful infrastructure for cyberattacks or unauthorized access, protecting the broader internet ecosystem.
The explicit inclusion of 'modifications to evade detection or medical countermeasures' closes a loophole that has existed in some other AI platform policies, making this prohibition among the most comprehensive in the industry.
Any app or platform using Claude that is designed for or likely to attract minor users must implement additional protections — failure to do so exposes the operator to COPPA liability and potential FTC enforcement.
This provision prohibits some of the most harmful uses of generative AI — including AI-powered influence operations and large-scale disinformation campaigns — and places affirmative obligations on operators.
Misleading health claims in apps can lead consumers to forgo professional medical care, creating real patient safety risk — Apple's requirement that health apps disclose limitations and recommend professional consultation directly addresses that risk.
The platform hosts AI characters that can engage in a wide range of conversations, and access by minors raises significant safety concerns that the Terms attempt to address through age gating.
AI conversational tools present elevated risk for minors, and the adequacy of age verification mechanisms is a key regulatory concern globally; failure to prevent minor access can result in significant regulatory penalties.
Grindr relies on self-attestation rather than independent age verification, meaning minors may access the platform despite the prohibition — a safety risk given the platform's adult content.
Despite the 13-year minimum age requirement, Meta has faced extensive regulatory scrutiny for collecting and profiling data about minors, and the Terms place significant responsibility on parents rather than the platform.
Despite stated restrictions, research and regulatory investigations have found that minors routinely access Meta's platforms and may be subject to behavioral profiling and targeted advertising, creating ongoing safety and privacy risks.
Placing sanctions compliance obligations on individual users is highly unusual and creates personal legal risk — violating OFAC sanctions can result in civil penalties of up to $1 million per transaction.
Knowing which AI uses Microsoft has banned helps consumers understand the boundaries placed on how its powerful AI tools can be used — including against them.
Children under 13 are legally prohibited from using OpenAI services, and teens between 13 and 17 should only use them with parental permission — failure to enforce this creates significant legal risk under COPPA.
If a minor uses OpenAI's products without appropriate consent mechanisms, their data may be collected without the protections required by law, creating risks for both families and OpenAI.
These prohibitions apply to every user and operator without exception, meaning no business agreement or special permission can authorize these activities — violations will result in enforcement action.
If a minor uses OpenAI services without proper parental consent, both the minor and the account holder may be in violation of the Terms — and OpenAI may collect or process that minor's data without the protections required by law.
This prohibition directly addresses the potential for AI to be weaponized in democratic processes, and violations could expose users to both OpenAI enforcement action and potential violations of federal election law.
These absolute prohibitions represent OpenAI's commitment to a floor of safety behavior that no business customer or individual user can disable — but the system card also acknowledges that adversarial users may attempt to circumvent these safeguards.
Generating or attempting to generate CSAM is a federal crime regardless of the medium used, and OpenAI's prohibition combined with a content reporting mechanism creates both a legal compliance layer and an enforcement pathway.
If a child under 13 uses Reddit, the platform is not legally authorized to collect their data under COPPA, but the ToS places enforcement burden on users and parents rather than Reddit implementing robust age verification.
Merchants in or adjacent to prohibited categories may find their business model incompatible with Shopify, and risk account termination if they inadvertently sell a prohibited item.
Merchants selling goods that resemble or replicate branded products risk account termination, and may also face separate legal action from rights holders outside of Shopify.
Snapchat's age verification relies on self-reported age rather than technical verification, which means children under 13 may access the platform without parental knowledge, and COPPA protections may not apply in practice.
TikTok's age restriction provisions directly affect the safety and privacy of minors, and parents who allow children to use TikTok under the standard Terms bear legal responsibility for the child's use of the service.
WhatsApp does not have robust age verification mechanisms, and younger users may still access the platform, raising child safety and legal compliance concerns.
This policy affects what users can post and can result in permanent account bans, meaning loss of access to followers, content, and any associated paid features.
Parents and guardians bear legal responsibility for any activity conducted by minors on Amazon accounts, including purchases and content submissions, and should actively supervise use.
This provision protects the broader internet infrastructure and other AWS customers from attacks launched via AWS's powerful network resources.
Businesses using AWS for email marketing or communications must comply with CAN-SPAM and similar laws or risk account termination on top of regulatory penalties.
This provision prohibits deceptive AI identity fraud — a growing consumer protection concern — and is directly relevant to chatbot deployments, customer service automation, and AI companionship products.
This provision means the rules governing how Anthropic's AI can be used are not uniform — government customers may be permitted to use Claude in ways that would be prohibited for all other users, based on the terms of their specific agreements.
Minors are prohibited from using Claude.ai and Claude Pro, and parents or guardians should be aware that the services are not designed or permitted for children.
Without strong age verification, children under 13 may use the platform and have their sensitive data — including photos and location — collected without the heightened protections required by law.
The presence of active monitoring for underage users is a meaningful safety protection, though users should be aware that the method of detection is not fully specified.
Parents should know that minors using Chegg are subject to the same terms, including arbitration waivers and data collection practices, and parental consent is required for users under 18.
This restriction places Epic's intellectual property outside the scope of AI training, and any violation could result in account termination and potential legal liability.
Parents should be aware that minors using Eventbrite require guardian oversight and that certain platform features may be restricted or prohibited for underage users.
Parents and educators should be aware that Figma is not legally available to children under 13 (or 16 in the EU), and any use by minors below these ages violates the ToS and may result in account termination.
Given the nature of the platform, ensuring minors cannot access the service or have their data collected is both a legal obligation and a significant safety issue.
Setting the minimum age at 16 rather than 13 means Headspace goes beyond COPPA's requirements but must still ensure effective age verification, particularly given the sensitive mental health data the service collects.
This provision is designed to protect user safety by barring individuals with certain criminal histories from the platform, but enforcement relies on self-reporting with no stated verification mechanism.
This minimum age requirement reflects COPPA obligations in the US, but does not extend enhanced protections to teenagers aged 13-17 who may be exposed to AI-generated content and mature datasets.
This is an unusually explicit prohibition that goes beyond standard copyright restrictions and specifically targets AI use cases — violating it could result in account termination and potential legal liability.
Children under 16 are prohibited from using LinkedIn, but the platform relies on self-reporting with no robust age verification mechanism, which may expose younger users to data collection and processing.
Sharing your Netflix account with people who don't live with you may violate these Terms and could result in account suspension or termination.
This is a notable and relatively new provision targeting AI developers and researchers who might otherwise attempt to use Netflix content for building or fine-tuning AI models.
This clause is broader than typical content restrictions — it extends to any AI-related activity including prompting, fine-tuning, and benchmarking, which could affect developers, researchers, and enterprise users.
Sharing your account with people outside your home — such as family members in other locations — may violate this term and could result in account restrictions or termination.
Violating these restrictions can result in account termination and potential legal action from Nintendo, even for seemingly minor activities like fan-made content.
Parents are legally responsible for all activity on the account, including purchases and content accessed by children, so setting up parental controls is essential to protect minors.
Users traveling internationally or those in certain regions may find significant portions of the content library unavailable, and attempting to use a VPN to access content may result in account suspension.
Patreon hosts adult content on its platform, making age verification a significant safety consideration, particularly for protecting minors from accessing inappropriate material.
Minors who use Poshmark are conducting real financial transactions — buying and selling goods — and parents may be liable for transactions made by their children under their supervision.
This clause is strictly enforced in real estate data contexts and could expose researchers, developers, or competitors to legal claims under the Computer Fraud and Abuse Act if they access Redfin data without authorization.
Merchants in the adult content space may be prohibited from using Shopify entirely or may require specific platform approval, limiting their channel options.
The AUP is a separate document that Shopify can update, and violations — even unintentional ones — can result in your store being shut down. Merchants must actively monitor the AUP for changes.
Consumers are protected from spam and harassment originating from Shopify merchants. Merchants who violate this rule face account suspension.
Merchants selling firearms accessories or related products must carefully verify compliance with applicable federal and state law, as Shopify's prohibition is broad and violations result in account termination.
Merchants offering fintech, crypto, or lending products via Shopify may find key business functions are prohibited or require special approval, creating operational and legal uncertainty.
Violations of the AUP — even by individual employees — can trigger account suspension or termination, making organizational oversight of user behavior important.
The full scope of restrictions on your ability to use Square is not contained in this document alone — you must also comply with a separate policy document that can be updated independently, and any violation of that policy can result in account suspension or termination.
Parents and guardians should be aware that the service is restricted to adults, and the platform generates AI content that may be entirely unsuitable for minors.
Businesses that expand into new product lines or markets without updating their Stripe account description risk termination and fund withholding, even if the new activity is legal.
You are contractually bound to comply with card network rules that you are not given the full text of and that can change without direct notice to you, creating a hidden compliance obligation that can result in fines or account termination.
This rule prevents Clients and Taskers from arranging tasks directly to avoid platform fees, but it also means Taskrabbit can monitor, access, and retain all your task-related communications.
The platform handles sensitive personal data and facilitates connections between adults; the presence of minors creates significant safety and legal risks that the terms attempt to address through prohibitions on underage access.
Violations of the acceptable use policy can result in immediate account suspension, and customers are responsible for ensuring all communications sent through their applications comply with these rules.
This restricts third-party tools and browser extensions that some users rely on to aggregate or analyze their financial data across platforms.
Users who download or share market data, screenshots, or platform content for commercial purposes could face legal claims from Webull for intellectual property infringement.
This provision restricts how researchers, journalists, developers, and ordinary users can interact with X's data, and violations could result in legal action or account termination.
Children under 13 are prohibited from using X, and this minimum age threshold triggers specific legal obligations around minors' data under COPPA in the US and GDPR-related children's data protections in the EU.
While Yelp restricts use by children under 13, it does not have verified age-gating, meaning children could potentially create accounts, and parents should monitor their children's use.
This provision limits Chase's legal exposure under COPPA and signals that parents should prevent minors from using Chase's online platforms unsupervised.
Using Chegg's content for commercial purposes, sharing answers widely, or reproducing platform content without permission could result in account termination or legal action.
You do not own the software on your Fitbit device — you are only licensed to use it, and that license can be revoked. This limits what you can do with your device and the data generated by it.
Unauthorized downloading or sharing of content can expose users to legal liability under copyright law.
Public is a regulated financial services platform and restricts access to legal adults. If a minor uses the platform under false pretenses, the account may be terminated and transactions may be reversed.
Anyone visiting the Salesforce website is bound by these terms, which set rules on what you can and cannot do on the site and limit Salesforce's liability for website content.
This clause protects against minors entering into legally binding service agreements, but places the verification burden entirely on users with no proactive age-gating mechanism described in these Terms.
Minors who access the platform have no legal standing under this agreement, and any account or activity by a minor is unauthorized and could be subject to termination.