This page describes what the document states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability may vary by jurisdiction.

Methodology
This document sets the rules for developers who use Google's Gemini API and AI Studio to build applications and access AI models. By default, Google states that it may use the prompts, inputs, and outputs developers send through the API to improve its AI models unless the developer actively turns off this setting in the API console. Developers who build apps on this API and handle real users' data should review their own privacy obligations and confirm that their project's data-use setting is configured as intended.
This document governs developer access to and use of the Gemini API and Google AI Studio, operating as Additional Terms of Service that supplement the Google APIs Terms of Service and, where applicable, the Google Cloud Platform Terms of Service. The agreement states that developers grant Google a license to use submitted content to provide, improve, and develop Google products and services, and the terms authorize Google to use API inputs and outputs for model improvement unless developers opt out via a designated setting. The agreement also makes developers responsible for obtaining end-user consent where required and for complying with applicable law when deploying API-powered applications.

Notably, the terms assert a broad content license for model improvement that applies by default and requires affirmative opt-out rather than opt-in. The agreement further disclaims warranties and caps Google's liability at the amounts paid in the prior twelve months. These provisions are common in developer API agreements, but their enforceability may be constrained under applicable law depending on jurisdiction.

The document engages GDPR and related EU data protection frameworks, the EU AI Act, the CCPA, and general FTC unfair and deceptive practices authority, particularly given the default data-use posture and the AI-specific obligations that apply to developers who deploy Gemini models in end-user-facing products. Compliance exposure depends significantly on whether developers are classified as data controllers or processors under applicable privacy law, and on the use-case risk tier under the EU AI Act.
Institutional analysis available with Professional
Regulatory exposure by statute, material risk assessment, vendor due diligence action items, and enforcement precedent. Available on Professional.
Start Professional free trial

Monitoring
Google AI Studio has updated this document before.
Watcher includes same-day alerts, structured change summaries, and monitoring for up to 10 platforms.
Professional Governance Intelligence
Need provision-level monitoring and regulatory mapping?
Professional includes governance timelines, compliance memos, audit-ready analysis, and full provision tracking.
Start Professional free trial

Cross-platform context
See how other platforms handle Default Model Training on API Inputs and Outputs and similar clauses.
Compare across platforms →

Governance Monitoring
Structured alerts for policy changes, governance events, and provision updates across 318+ platforms.