
AI and Machine Learning Training Using Personal Data

High severity · Medium confidence · Explicit document language · Unique (0 of 325 platforms)
Recent governance activity: Cash App recorded 7 documented changes in the last 30 days.
Document Record

What it is

Cash App states that it may use your personal data, including transaction history, behavioral data, and profile information, to train AI and machine learning models, and to draw inferences that build a profile of your credit risk, preferences, and shopping habits.

This analysis describes what Cash App's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

The authorization to use personal data for AI training is explicit and broad, and the notice does not describe limits on which data categories may be used for this purpose or how long AI-trained models derived from user data are retained.

Interpretive note: The notice does not specify which data categories are included in or excluded from AI training, and the scope of profiling opt-out rights available to non-California users depends on the applicable state law framework.

Recent Activity

This document changed recently

Medium severity · Apr 19, 2026

The updated policy establishes that children under 13 may use Cash App services if a parent or guardian signs up for or authorizes the account on their behalf. Previously, the policy explicitly prohi…

Medium severity · Apr 10, 2026

The revised policy shifts from prohibiting all children under 13 from using Cash App to permitting use when a parent or guardian explicitly authorizes or signs up for the service on the child's behal…

Consumer impact (what this means for users)

The policy states that data including transaction history, behavioral data, and inferred characteristics may be used to train AI models and to build profiles reflecting credit risk, preferences, and shopping habits. California residents have a right to opt out of profiling under the CPRA, and users in other states with profiling opt-out rights under applicable state laws may also be entitled to limit this use.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Opt Out of Profiling
    Navigate to the 'Your Rights and Choices' section of the Cash App Privacy Notice and submit a request to limit use or opt out of profiling and automated decision-making where that right is available in your state.

How other platforms handle this

Klarna Medium

We use your personal data to develop, train, and improve our artificial intelligence and machine learning models. This includes using your transaction data, behavioral data, and interaction data to enhance our fraud detection, credit assessment, and personalization capabilities. We take steps to pro...

Stripe Medium

We use Personal Data to detect and prevent fraud, and to develop and improve our fraud detection models and other machine learning systems. This may include using transaction data, device information, and other Personal Data to train and refine our systems.

Writer Medium

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.


Monitoring

Cash App has changed this document before.

Original Clause Language

"Improving, personalizing and facilitating your use of our Services, content and applications, including by training artificial intelligence (AI) and other machine learning models; Drawing inferences from any of the information we collect to create a profile about you that may reflect, for example, your credit risk profile, your preferences, characteristics, shopping habits, and other behavior, to enhance our Services to you and maintain a trusted environment;"

— Excerpt from Cash App's Cash App Privacy Policy


Institutional analysis (Compliance & governance intelligence)

1) Regulatory landscape: The CCPA/CPRA grants California residents the right to opt out of automated decision-making and profiling, enforced by the California Privacy Protection Agency. The FTC Act applies to AI-based profiling that could constitute unfair or deceptive practices. Emerging state AI and automated decision-making laws in Colorado, Connecticut, and other jurisdictions may impose additional disclosure and opt-out obligations. The use of financial transaction data for AI training may also interact with GLBA limitations on secondary use of nonpublic personal information.

2) Governance exposure: High. The authorization to train AI models on personal data without a clear opt-out mechanism described in the notice, combined with the inference-drawing provision to build behavioral and credit risk profiles, creates material exposure under CCPA/CPRA profiling opt-out requirements and emerging state AI transparency regulations. The notice does not specify which data categories are excluded from AI training use.

3) Jurisdiction flags: California residents have the strongest existing opt-out rights for automated profiling under the CPRA. Colorado (CPA), Connecticut (CTDPA), and Texas (TDPSA) residents may have profiling opt-out rights depending on the nature of decisions made. Users subject to credit decisions based on AI-inferred profiles may have additional rights under the Fair Credit Reporting Act (FCRA) depending on how inferences are used.

4) Contract and vendor implications: If AI model training is performed by or in conjunction with third-party vendors, data processing agreements must confirm that personal data used for training is not retained by vendors for their own model development. The notice does not address whether AI models trained on user data are shared with affiliates or third parties, which is a relevant vendor and affiliate contract review consideration.

5) Compliance considerations: Compliance teams should evaluate whether the AI training authorization is adequately disclosed for CPRA purposes and whether an opt-out mechanism for profiling and automated decision-making is available and prominently surfaced. A data inventory should confirm which personal data categories flow into AI training pipelines. GLBA secondary use limitations should be reviewed against the scope of AI training described in the notice.


Applicable agencies

  • FTC
    The FTC has authority over AI-based profiling and automated decision-making practices that may constitute unfair or deceptive acts under the FTC Act.
    File a complaint →
  • CFPB
    The CFPB has authority over automated credit risk profiling practices that interact with consumer financial product eligibility determinations.
    File a complaint →

Provision details

Document information
Document: Cash App Privacy Policy
Entity: Cash App
Document last updated: May 5, 2026

Tracking information
First tracked: May 7, 2026
Last verified: May 12, 2026
Record ID: CA-P-011243
Document ID: CA-D-00076

Evidence Provenance
Source URL: Wayback Machine
Content hash (SHA-256): 4059d89cdc63408c5adcd690e82cb0b567a1b312f1966010d4ced9f9938b69c3
Analysis generated: May 7, 2026 06:31 UTC
Evidence: ✓ Snapshot stored · ✓ Hash verified
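The published SHA-256 content hash lets anyone independently confirm that a locally saved copy of the archived snapshot is byte-identical to the version this analysis was generated from. A minimal Python sketch follows; the snapshot filename is hypothetical, and whether the hash covers the raw HTML bytes or some normalized form of the document is an assumption about the archival method.

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Stream a file through SHA-256 and return the lowercase hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large snapshots don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hash published in the evidence record above.
PUBLISHED_HASH = "4059d89cdc63408c5adcd690e82cb0b567a1b312f1966010d4ced9f9938b69c3"

def matches_record(path: str, published: str = PUBLISHED_HASH) -> bool:
    """True if the local file's digest matches the published record hash."""
    # Compare case-insensitively, since hex digests are sometimes upper-cased.
    return sha256_of_file(path) == published.lower()
```

A mismatch only tells you the bytes differ; it cannot distinguish a later document revision from a difference in how the snapshot was saved (encoding, trailing whitespace, dynamic page elements).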
Citation Record
Entity: Cash App
Document: Cash App Privacy Policy
Record ID: CA-P-011243
Captured: 2026-05-07 06:31:37 UTC
SHA-256: 4059d89cdc63408c…
URL: https://conductatlas.com/platform/cash-app/cash-app-privacy-policy/ai-and-machine-learning-training-using-personal-data/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
Classification
Severity: High



Built from archived source documents, structured governance mappings, and historical version tracking.

Frequently Asked Questions

What does Cash App's AI and Machine Learning Training Using Personal Data clause do?

The clause states that Cash App may use personal data, including transaction history, behavioral data, and profile information, to train AI and machine learning models and to draw inferences that build a profile reflecting credit risk, preferences, and shopping habits. The authorization is explicit and broad, and the notice does not describe limits on which data categories may be used for this purpose or how long AI-trained models derived from user data are retained.

How does this clause affect you?

The policy states that data including transaction history, behavioral data, and inferred characteristics may be used to train AI models and build profiles reflecting credit risk, preferences, and shopping habits; California residents have a right to opt out of profiling under the CPRA, and users in other states with profiling opt-out rights under applicable state laws may also be entitled to limit this use.

Is ConductAtlas affiliated with Cash App?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Cash App.