Strava uses your personal information including health data and GPS location to train its AI and machine learning systems, though the extent depends on your in-app privacy settings.
This analysis describes what Strava's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
This clause authorizes the use of sensitive data categories, including health metrics and precise location, for AI development. That is a broad permission: it extends beyond basic service delivery and may not be intuitive to users who think of Strava simply as a workout tracker.
Interpretive note: The policy conditions AI training uses on "privacy controls and sharing permissions" but does not specify which settings limit AI training uses of health data, making it difficult for users to know precisely what to adjust.
Your location history, heart rate, and other health data may be used to train Strava's AI models, with the scope depending on your privacy controls; users who do not actively adjust these settings may be contributing more data to AI development than they realize.
How other platforms handle this
We use your personal data to develop, train, and improve our artificial intelligence and machine learning models. This includes using your transaction data, behavioral data, and interaction data to enhance our fraud detection, credit assessment, and personalization capabilities. We take steps to pro...
We use Personal Data to detect and prevent fraud, and to develop and improve our fraud detection models and other machine learning systems. This may include using transaction data, device information, and other Personal Data to train and refine our systems.
Writer does not use Customer Data to train its AI models without explicit customer permission. Customer Data means the data, content, and information that customers and their end users submit to or through the Services.
Monitoring
Strava has changed this document before.
"We use information to enhance the quality, reliability, and/or accuracy of our AI Features by creating, developing, training, testing, improving, and maintaining AI and ML models run by Strava or our service providers. We use aggregated, de-identified data for this purpose. We also use personal information, including health and Location Information, for AI Features, depending on your privacy controls and sharing permissions."
— Excerpt from Strava's Privacy Policy
REGULATORY LANDSCAPE: This provision engages GDPR Article 6 (lawful basis for processing) and Article 9 (health data as a special category), the CCPA and CPRA sensitive personal information provisions, and emerging EU AI Act requirements for AI systems trained on personal data. The FTC's guidance on AI and data practices is also relevant. Enforcement authorities include the European Data Protection Board, national EU supervisory authorities, the California Privacy Protection Agency, and the FTC.

GOVERNANCE EXPOSURE: High. The authorization to use personal health and location data for AI training is broad and conditional on privacy controls that many users may not actively manage. Under the GDPR, training AI models on special category data such as health information requires a valid legal basis beyond legitimate interests, typically explicit consent. The policy does not clearly specify the legal basis for this processing for EEA users, which represents a material compliance gap.

JURISDICTION FLAGS: EEA and UK users face the highest regulatory exposure given GDPR Article 9 requirements for health data processing. The California CPRA grants consumers the right to limit use of sensitive personal information, which likely encompasses health and precise location data used for AI training. Washington, Colorado, and Connecticut state privacy laws impose similar restrictions. The EU AI Act may impose additional transparency and documentation requirements depending on how Strava's AI systems are classified.

CONTRACT AND VENDOR IMPLICATIONS: Service providers operating AI and ML models on Strava's behalf must be covered by data processing agreements that specify permitted uses, data minimization requirements, and retention limits. Procurement teams should assess whether these vendors have adequate technical controls to enforce the de-identification and aggregation claims made in the policy.
COMPLIANCE CONSIDERATIONS: Legal teams should document the specific legal basis asserted for AI training uses of health and location data in each jurisdiction, and verify that consent mechanisms are sufficiently granular to cover this use case. Privacy impact assessments for AI training pipelines involving health data should be conducted or reviewed. Data subject rights processes should be evaluated to determine whether users can effectively exclude their data from AI training without losing core service functionality.
Built from archived source documents, structured governance mappings, and historical version tracking.
Is ConductAtlas affiliated with Strava? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Strava.