Bumble uses automated algorithms to decide which other users to show you based on predicted compatibility, and the details of how this works are in the Privacy Policy rather than these Terms.
This analysis describes what Bumble's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. See our methodology for details.
Automated profiling and recommender systems that affect which users you see or who sees you involve processing of your personal data for algorithmic decision-making, which has specific rights implications under GDPR and the EU Digital Services Act for users in those jurisdictions.
Interpretive note: The adequacy of the recommender system disclosure depends on the content of the Privacy Policy, which was not provided for review; compliance with DSA and GDPR Article 22 cannot be fully assessed from the Terms alone.
Your profile data and behavior on the app are processed by automated algorithms to determine your compatibility matches. Material information about the parameters used is located in a separate Privacy Policy document rather than in these Terms, so users must review multiple documents to understand how they are profiled.
"We have developed matching algorithms to predict your compatibility with other users and so we can show you people we think are a good match for you. You can learn more about our use of recommender systems and the main parameters we use in our Privacy Policy." — Excerpt from Bumble's Terms and Conditions
REGULATORY LANDSCAPE: This provision engages GDPR Article 22 on automated individual decision-making and profiling, which grants EU users rights regarding decisions made solely by automated processing. The EU Digital Services Act requires platforms to disclose the main parameters of recommender systems and to offer users options to modify those parameters. The FTC has shown increasing interest in algorithmic decision-making transparency in consumer-facing applications.

GOVERNANCE EXPOSURE: Medium. The Terms acknowledge the use of recommender systems but defer detailed disclosure to the Privacy Policy, which may satisfy DSA transparency requirements if the Privacy Policy is adequately detailed. However, the adequacy of that disclosure should be independently assessed. For EU users, the right to obtain human review of algorithmic decisions affecting their experience on the platform may require operational implementation.

JURISDICTION FLAGS: EU and EEA users have the most robust rights in this area under GDPR Article 22 and the DSA. California users have rights under the CCPA regarding the use of sensitive personal information in automated decision-making. UK users are protected under equivalent UK GDPR provisions.

CONTRACT AND VENDOR IMPLICATIONS: If third-party AI or data analytics vendors are used to power matching algorithms, data processing agreements should confirm compliance with GDPR processor obligations and DSA recommender system transparency requirements. Algorithmic auditing provisions in vendor contracts should be reviewed.

COMPLIANCE CONSIDERATIONS: Legal teams should verify that the Privacy Policy disclosure on recommender system parameters meets DSA requirements for specificity and user control options. GDPR Article 22 compliance documentation should confirm whether the matching algorithm constitutes solely automated decision-making with legal or similarly significant effects, and if so, whether the required safeguards are in place.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Bumble.