Cash App states it may use your personal data including transaction history, behavioral data, and profile information to train AI and machine learning models and to draw inferences that build a profile about your credit risk, preferences, and shopping habits.
This analysis describes what Cash App's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology.
The authorization to use personal data for AI training is explicit and broad, and the notice does not describe limits on which data categories may be used for this purpose or how long AI-trained models derived from user data are retained.
Interpretive note: The notice does not specify which data categories are included in or excluded from AI training, and the scope of profiling opt-out rights available to non-California users depends on the applicable state law framework.
The updated policy establishes that children under 13 may use Cash App services if a parent or guardian signs up for or authorizes the account on their behalf. Previously, the policy explicitly prohibited all children under 13 from using the service.
The revised policy shifts from prohibiting all children under 13 from using Cash App to permitting use when a parent or guardian explicitly authorizes or signs up for the service on the child's behalf.
The policy states that data including transaction history, behavioral data, and inferred characteristics may be used to train AI models and build profiles reflecting credit risk, preferences, and shopping habits; California residents have a right to opt out of profiling under the CPRA, and users in other states with profiling opt-out rights under applicable state laws may also be entitled to limit this use.
How other platforms handle this
We use your personal data to develop, train, and improve our artificial intelligence and machine learning models. This includes using your transaction data, behavioral data, and interaction data to enhance our fraud detection, credit assessment, and personalization capabilities. We take steps to pro...
We use Personal Data to detect and prevent fraud, and to develop and improve our fraud detection models and other machine learning systems. This may include using transaction data, device information, and other Personal Data to train and refine our systems.
Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.
Monitoring
Cash App has changed this document before.
Receive same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
"Improving, personalizing and facilitating your use of our Services, content and applications, including by training artificial intelligence (AI) and other machine learning models; Drawing inferences from any of the information we collect to create a profile about you that may reflect, for example, your credit risk profile, your preferences, characteristics, shopping habits, and other behavior, to enhance our Services to you and maintain a trusted environment."
— Excerpt from the Cash App Privacy Policy
1) REGULATORY LANDSCAPE: The CCPA/CPRA grants California residents the right to opt out of automated decision-making and profiling, enforced by the California Privacy Protection Agency. The FTC Act applies to AI-based profiling that could constitute unfair or deceptive practices. Emerging state AI and automated decision-making laws in Colorado, Connecticut, and other jurisdictions may impose additional disclosure and opt-out obligations. The use of financial transaction data for AI training may also interact with GLBA limitations on secondary use of nonpublic personal information.

2) GOVERNANCE EXPOSURE: High. The authorization to train AI models on personal data without a clear opt-out mechanism described in the notice, combined with the inference-drawing provision to build behavioral and credit risk profiles, creates material exposure under CCPA/CPRA profiling opt-out requirements and emerging state AI transparency regulations. The notice does not specify which data categories are excluded from AI training use.

3) JURISDICTION FLAGS: California residents have the strongest existing opt-out rights for automated profiling under the CPRA. Colorado (CPA), Connecticut (CTDPA), and Texas (TDPSA) residents may have profiling opt-out rights depending on the nature of the decisions made. Users subject to credit decisions based on AI-inferred profiles may have additional rights under the Fair Credit Reporting Act (FCRA), depending on how inferences are used.

4) CONTRACT AND VENDOR IMPLICATIONS: If AI model training is performed by or in conjunction with third-party vendors, data processing agreements must confirm that personal data used for training is not retained by vendors for their own model development. The notice does not address whether AI models trained on user data are shared with affiliates or third parties, which is a relevant consideration for vendor and affiliate contract review.
5) COMPLIANCE CONSIDERATIONS: Compliance teams should evaluate whether the AI training authorization is adequately disclosed for CPRA purposes and whether an opt-out mechanism for profiling and automated decision-making is available and prominently surfaced. A data inventory should confirm which personal data categories flow into AI training pipelines. GLBA secondary use limitations should be reviewed against the scope of AI training described in the notice.
Full compliance analysis
Regulatory citations, enforcement risk, and due diligence action items.
Free: track 1 platform + weekly digest. Watcher: 10 platforms + same-day alerts. No credit card required.
How 10 AI platforms describe the use of user data for model training, improvement, and development, based on archived governance provisions.
Professional Governance Intelligence
Need to monitor specific governance provisions?
Professional includes provision-level monitoring, governance timelines, regulatory mapping, and audit-ready analysis.
Built from archived source documents, structured governance mappings, and historical version tracking.
No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cash App.