Cisco may use your authentication logs and other personal data to train and improve its AI and machine learning systems embedded in Duo and other Cisco products.
This analysis describes what Duo Security's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances.
Your authentication behavior data, including login patterns, device types, and application access, may contribute to training AI models, which raises data minimization and purpose limitation questions under privacy frameworks like GDPR.
Interpretive note: The policy does not specify whether authentication data is anonymized or aggregated before AI model training, creating ambiguity about the actual privacy impact of this provision.
Authentication event data you generate using Duo may be used to train Cisco's internal AI models, potentially extending the use of your data beyond the direct delivery of authentication services.
How other platforms handle this
We are simplifying our Terms of Use, including clarifications around the use of AI tools, and their data use. We have moved the terms that describe AI Features, which were previously written for a Creator audience and located under the AI-Based Tools Supplemental Terms and Disclaimer, into the User ...
We may use machine learning and other artificial intelligence (AI) technologies ("AI Technologies") to provide and improve our Service. For example, we may use such AI Technologies to analyze and process your contributions and interactions to provide you with personalized experiences, content recomm...
We use Personal Data to detect and prevent fraud, and to develop and improve our fraud detection models and other machine learning systems. This may include using transaction data, device information, and other Personal Data to train and refine our systems.
Monitoring
Duo Security has changed this document before.
"We may use personal data to develop, improve, and support our products and services. This may include using data to train and improve artificial intelligence and machine learning models used in our products." — Excerpt from Duo Security's Duo Privacy
REGULATORY LANDSCAPE
GDPR Article 5 requires that personal data be collected for specified, explicit, and legitimate purposes and not further processed in a manner incompatible with those purposes. Using authentication logs for AI model training may require a separate legal basis or compatibility assessment under the GDPR. The EU AI Act, depending on the risk classification of Cisco's AI systems, may impose additional transparency and documentation obligations. The FTC has signaled heightened scrutiny of AI training data practices under its unfair-or-deceptive-practices authority.

GOVERNANCE EXPOSURE
Medium. The AI training use case is described at a high level of generality, without specifying whether data is anonymized or aggregated before use, what model types are trained, or whether users can opt out. This lack of specificity may create tension with GDPR transparency requirements and emerging AI governance expectations.

JURISDICTION FLAGS
EEA users may have grounds to object to AI model training as a form of further processing under GDPR Article 21 if Cisco relies on legitimate interests as the legal basis. UK ICO guidance on AI and data protection is also relevant. California residents may have CPRA rights related to the use of their data for machine learning if it constitutes profiling.

CONTRACT AND VENDOR IMPLICATIONS
Enterprise DPAs should be reviewed to determine whether AI model training is listed as a permitted use of customer data processed under the enterprise agreement, or whether it is restricted to anonymized or aggregated data only. Vendors providing Duo to regulated industries should clarify this scope explicitly.

COMPLIANCE CONSIDERATIONS
Legal and privacy teams should request clarification from Cisco on whether authentication data used for AI training is anonymized or pseudonymized prior to use, and whether enterprise customer data is segregated from consumer data in AI training pipelines.
If Cisco relies on legitimate interests for this processing, a Legitimate Interests Assessment should be available to enterprise customers upon request.
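To make the anonymized-versus-pseudonymized distinction concrete, the sketch below shows one common pseudonymization technique: replacing direct identifiers in authentication event records with keyed hashes before the records enter any analytics or training pipeline. This is an illustrative example only, not Cisco's or Duo's actual practice; the field names, key handling, and record shape are all assumptions.

```python
import hashlib
import hmac

# Hypothetical key: in practice this would live in a secrets manager
# and be rotated; anyone holding it can re-link pseudonyms to users.
PSEUDONYMIZATION_KEY = b"example-key-do-not-use-in-production"

def pseudonymize_event(event: dict) -> dict:
    """Return a copy of an auth event with direct identifiers replaced
    by truncated HMAC-SHA256 digests. The keyed hash is consistent
    (the same user always maps to the same token, so login patterns
    remain analyzable) but is not reversible without the key."""
    out = dict(event)
    for field in ("user_id", "device_id", "ip_address"):  # assumed field names
        if field in out:
            digest = hmac.new(
                PSEUDONYMIZATION_KEY,
                str(out[field]).encode("utf-8"),
                hashlib.sha256,
            ).hexdigest()
            out[field] = digest[:16]  # truncated token for readability
    return out

event = {
    "user_id": "alice@example.com",
    "device_id": "D-1234",
    "ip_address": "203.0.113.7",
    "result": "success",
}
print(pseudonymize_event(event))
```

Note that under the GDPR, data pseudonymized this way generally remains personal data (the key re-links it), whereas properly anonymized or aggregated data falls outside its scope, which is why the distinction matters when assessing an AI-training provision.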
ConductAtlas has identified this type of provision across 2 platforms.
Is ConductAtlas affiliated with Duo Security? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by Duo Security.