OpenAI's standard API services are not designed to handle protected health information (PHI) under US healthcare privacy law. If you want to use the API with patient health data, you must first execute a separate legal agreement with OpenAI.
This analysis describes what OpenAI's agreement states, permits, or reserves. It does not constitute a legal determination about enforceability. Regulatory applicability and practical outcomes may vary by jurisdiction, enforcement context, and individual circumstances. Read our methodology for details.
This provision places the compliance burden on the operator to identify when HIPAA applies to their use case and to execute a BAA before submitting any protected health information. Using the API with PHI without a BAA in place would constitute a potential HIPAA violation by the operator.
If a business uses OpenAI's API to process health-related personal data about individuals without first executing a HIPAA Business Associate Agreement, that business may be operating in violation of US healthcare privacy law. Individuals whose health data is processed through an API product should ensure the operator has appropriate legal agreements in place.
How other platforms handle this
While the categories of Restricted Content above provide a clear framework, we may also moderate other types of Content in response to evolving challenges posed by advancements in Machine Learning. As we assess such Content, we hold consent as a core value, ensuring our approach remains thoughtful, ...
Mistral AI may monitor use of the Mistral AI Products through automated means in accordance with the Usage Policy. This monitoring is conducted to ensure compliance with Mistral AI's terms and policies, and to maintain the security and integrity of Mistral AI Products. We reserve the right to review...
This Neon Platform Services Product Specific Schedule ("Product Specific Schedule") is entered into as of the Effective Date between Neon, LLC ("Neon" or "we"), an affiliate of Databricks, Inc. ("Databricks"), and Customer (as defined below) ("Customer", "you," or "your") and governs Customer's use ...
Monitoring
OpenAI has changed this document before.
"The parties acknowledge that the Services are not designed for processing Protected Health Information as defined under HIPAA. If Customer wishes to use the Services to process Protected Health Information, Customer must enter into a separate Business Associate Agreement with OpenAI prior to submitting any Protected Health Information through the Services.— Excerpt from OpenAI's OpenAI Data Processing Addendum
Regulatory landscape: HIPAA (45 CFR Parts 160 and 164) requires covered entities and business associates to execute BAAs before sharing protected health information with service providers. The HHS Office for Civil Rights (OCR) is the primary enforcement authority. Operators in healthcare, health insurance, or health technology that process PHI via the OpenAI API without a BAA face potential HIPAA enforcement, including civil monetary penalties.

Governance exposure: High for healthcare-adjacent operators. The DPA explicitly states that the standard services are not designed for PHI, which signals that additional technical and contractual controls are required. Operators who submit PHI through the standard API without a BAA may be unable to defend a HIPAA compliance position if the data is involved in a breach or audit.

Jurisdiction flags: US-based covered entities and business associates under HIPAA face the most direct exposure. Non-US healthcare operators processing data about US patients may also face HIPAA obligations depending on their relationship to US covered entities.

Contract and vendor implications: Healthcare operators and their procurement teams should conduct a PHI data flow assessment before deploying OpenAI's API in clinical, insurance, or health administration contexts. The BAA must be executed and in place before any PHI is submitted. Operators should also assess whether OpenAI's API meets HIPAA technical safeguard requirements independently of the contractual BAA.

Compliance considerations: Operators in healthcare or health-adjacent sectors should include an OpenAI BAA execution step in their vendor onboarding process, conduct a HIPAA-specific risk assessment for any AI use case involving patient data, and confirm that de-identification procedures are applied before data reaches the standard API if a BAA has not been executed.
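The de-identification gate described above can be sketched in code. This is an illustrative sketch only, not OpenAI tooling or an actual compliance solution: the function names and regex patterns are our assumptions, and real HIPAA de-identification must address all 18 Safe Harbor identifier categories (or use Expert Determination), which simple pattern matching cannot guarantee.

```python
import re

# Illustrative patterns only -- a real de-identification pipeline must cover
# all 18 HIPAA Safe Harbor identifier categories, not just these three.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "DOB": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def deidentify(text: str) -> str:
    """Replace matched identifiers with typed placeholders like [SSN]."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

def prepare_for_api(text: str, baa_executed: bool) -> str:
    """Pre-submission gate: without an executed BAA, only de-identified
    text may be handed to the standard API client."""
    if not baa_executed:
        text = deidentify(text)
    return text
```

For example, `prepare_for_api("DOB 01/02/1980, SSN 123-45-6789", baa_executed=False)` strips both identifiers before the text ever reaches an API call, while the same call with `baa_executed=True` passes the text through unchanged.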
Built from archived source documents, structured governance mappings, and historical version tracking.
Is ConductAtlas affiliated with OpenAI? No. ConductAtlas is an independent monitoring service. We are not affiliated with, endorsed by, or sponsored by OpenAI.