Strava · Strava Privacy Policy

AI and Machine Learning Model Training Using Personal Data

Medium severity · Medium confidence · Explicit document language · Unique · 0 of 325 platforms
Document Record

What it is

Strava uses your personal information including health data and GPS location to train its AI and machine learning systems, though the extent depends on your in-app privacy settings.

This analysis describes what Strava's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.

ConductAtlas Analysis

Why it matters (compliance & governance perspective)

This clause authorizes use of sensitive data categories including health metrics and precise location for AI development, which is a broad permission that goes beyond basic service delivery and may not be fully intuitive to users who think of Strava as a workout tracker.

Interpretive note: The policy states that AI training uses depend on 'privacy controls and sharing permissions' but does not specify exactly which settings limit AI training uses of health data, making it difficult for users to know precisely what to adjust.

Consumer impact (what this means for users)

Your location history, heart rate, and other health data may be used to train Strava's AI models, with the scope depending on your privacy controls; users who do not actively adjust these settings may be contributing more data to AI development than they realize.

What you can do

⚠️ These actions may provide transparency or partial mitigation but may not fully address the underlying issue. Effectiveness varies by jurisdiction and individual circumstances.
  • Adjust Your Privacy Controls
    Log in to Strava and navigate to your Privacy Controls at strava.com/athlete/privacy. Review settings for activity visibility, data permissions, and AI feature participation, and adjust them to restrict use of your health and location data for AI development.

How other platforms handle this

Klarna Medium

We use your personal data to develop, train, and improve our artificial intelligence and machine learning models. This includes using your transaction data, behavioral data, and interaction data to enhance our fraud detection, credit assessment, and personalization capabilities. We take steps to pro...

Stripe Medium

We use Personal Data to detect and prevent fraud, and to develop and improve our fraud detection models and other machine learning systems. This may include using transaction data, device information, and other Personal Data to train and refine our systems.

Writer Medium

Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.


Monitoring

Strava has changed this document before.

Original Clause Language (Document Record)
We use information to enhance the quality, reliability, and/or accuracy of our AI Features by creating, developing, training, testing, improving, and maintaining AI and ML models run by Strava or our service providers. We use aggregated, de-identified data for this purpose. We also use personal information, including health and Location Information, for AI Features, depending on your privacy controls and sharing permissions.

— Excerpt from Strava's Strava Privacy Policy


Institutional analysis (Compliance & governance intelligence)

REGULATORY LANDSCAPE: This provision engages GDPR Article 6 (lawful basis for processing) and Article 9 (health data as a special category), CCPA and CPRA sensitive personal information provisions, and emerging EU AI Act requirements for AI systems trained on personal data. The FTC's guidance on AI and data practices is also relevant. Enforcement authorities include the European Data Protection Board, national EU supervisory authorities, the California Privacy Protection Agency, and the FTC.

GOVERNANCE EXPOSURE: High. The authorization to use personal health and location data for AI training is broad and conditional on privacy controls that many users may not actively manage. Under GDPR, training AI models on special category data such as health information requires a valid legal basis beyond legitimate interests, typically explicit consent. The policy does not clearly specify the legal basis for this processing for EEA users, which represents a material compliance gap.

JURISDICTION FLAGS: EEA and UK users face the highest regulatory exposure given GDPR Article 9 requirements for health data processing. California CPRA grants consumers the right to limit use of sensitive personal information, which likely encompasses health and precise location data used for AI training. Washington, Colorado, and Connecticut state privacy laws impose similar restrictions. The EU AI Act may impose additional transparency and documentation requirements depending on how Strava's AI systems are classified.

CONTRACT AND VENDOR IMPLICATIONS: Service providers operating AI and ML models on Strava's behalf must be covered by data processing agreements that specify permitted uses, data minimization requirements, and retention limits. Procurement teams should assess whether these vendors have adequate technical controls to enforce the de-identification and aggregation claims made in the policy.

COMPLIANCE CONSIDERATIONS: Legal teams should document the specific legal basis asserted for AI training uses of health and location data in each jurisdiction, and verify that consent mechanisms are sufficiently granular to cover this use case. Privacy impact assessments for AI training pipelines involving health data should be conducted or reviewed. Data subject rights processes should be evaluated to determine whether users can effectively exclude their data from AI training without losing core service functionality.


Applicable agencies

  • FTC
    The FTC has authority over unfair or deceptive practices related to AI data use and consumer privacy, including whether disclosures about AI training uses are adequate
  • State AG
    State attorneys general in California, Washington, and Colorado have enforcement authority under state privacy laws covering sensitive data use for AI training

Applicable regulations

  • GDPR (European Union)
  • UK GDPR (United Kingdom)

Provision details

Document information
  • Document: Strava Privacy Policy
  • Entity: Strava
  • Document last updated: May 5, 2026

Tracking information
  • First tracked: May 9, 2026
  • Last verified: May 9, 2026
  • Record ID: CA-P-007784
  • Document ID: CA-D-00272

Evidence Provenance
  • Source URL: Wayback Machine
  • Content hash (SHA-256): 1f04cde7030a965e9a65ea78be50fec4717b7bbf6a378112228c49d14a8f6010
  • Analysis generated: May 9, 2026 22:52 UTC
  • Evidence: ✓ Snapshot stored · ✓ Hash verified
Citation Record
Entity: Strava
Document: Strava Privacy Policy
Record ID: CA-P-007784
Captured: 2026-05-09 22:52:22 UTC
SHA-256: 1f04cde7030a965e…
URL: https://conductatlas.com/platform/strava/strava-privacy-policy/ai-and-machine-learning-model-training-using-personal-data/
Accessed: May 13, 2026
Permanent archival reference. Stable identifier suitable for legal filings, compliance documentation, and research citation.
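
The citation record above publishes a SHA-256 content hash so the archived snapshot can be independently verified. A reader holding a local copy of the captured document can recompute the digest and compare it to the published value. The sketch below assumes only Python's standard hashlib module; the filename in the usage comment is hypothetical.

```python
import hashlib

def verify_snapshot(path: str, expected_sha256: str) -> bool:
    """Recompute the SHA-256 digest of a stored document snapshot
    and compare it to a published content hash."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large snapshots do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()

# Usage (hypothetical local filename; hash from the citation record):
# verify_snapshot(
#     "strava-privacy-policy-2026-05-09.html",
#     "1f04cde7030a965e9a65ea78be50fec4717b7bbf6a378112228c49d14a8f6010",
# )
```

Note that the hash covers the exact captured bytes: re-downloading the live page, which may since have changed, will generally not reproduce the published digest; only the stored snapshot will.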
Classification
  • Severity: Medium


Frequently Asked Questions

What does Strava's AI and Machine Learning Model Training Using Personal Data clause do?

This clause authorizes use of sensitive data categories including health metrics and precise location for AI development, which is a broad permission that goes beyond basic service delivery and may not be fully intuitive to users who think of Strava as a workout tracker.

How does this clause affect you?

Your location history, heart rate, and other health data may be used to train Strava's AI models, with the scope depending on your privacy controls; users who do not actively adjust these settings may be contributing more data to AI development than they realize.

Is ConductAtlas affiliated with Strava?

No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Strava.