OpenAI · GPT-4o System Card (PDF)

Voice Impersonation and Cloning Risk

High severity

What it is

OpenAI's own safety testers found that GPT-4o's voice feature could be used to impersonate real people, which is why voice output is currently restricted — but these restrictions may not apply in all deployment contexts.

Consumer impact (what this means for users)

Your voice and likeness could be at risk if bad actors use GPT-4o to generate convincing audio impersonations. OpenAI currently limits voice output to a set of preset voices, but this restriction is not permanent and may not apply to all operator deployments.


Why it matters (compliance & risk perspective)

Voice cloning and impersonation capabilities in a widely deployed AI system create serious risks for fraud, non-consensual audio deepfakes, and reputational harm — and OpenAI's red team confirmed these risks exist in GPT-4o.

Original clause language
We are taking a conservative approach with audio outputs, including voice. Currently, voice output is limited to a set of preset voices and is not available in the API. This is partly to prevent misuse of voice generation for impersonation or to create misleading content... Red teamers noted that GPT-4o's voice could be used to generate speech that sounds like a real person.

Institutional analysis (Compliance & legal intelligence)

(1) REGULATORY FRAMEWORK: This provision implicates the FTC Act Section 5 (impersonation as an unfair or deceptive practice), the FTC's 2024 impersonation rule (16 CFR Part 461), state right-of-publicity laws (California Civil Code §3344, New York Civil Rights Law §§50-51), the EU AI Act Article 50 (transparency obligations for AI-generated audio content), and the DEEPFAKES Accountability Act (proposed but not enacted). The FTC has primary enforcement authority in the US.


Applicable agencies

  • FTC
    The FTC's 2024 impersonation rule and Section 5 authority directly cover AI voice cloning used for impersonation or fraud.
  • State AG
    State Attorneys General enforce right-of-publicity laws and consumer protection statutes implicated by AI voice impersonation capabilities.

Provision details

Document information
Document: GPT-4o System Card (PDF)
Entity: OpenAI
Document last updated: March 5, 2026

Tracking information
First tracked: March 10, 2026
Last verified: April 27, 2026
Record ID: CA-P-003142
Document ID: CA-D-00008

Evidence Provenance
Source URL: Wayback Machine
SHA-256: 7c23ef53467eea199596abe78511d57ffee1e94b50ef10ac0f7d81df278b5059
Verified: ✓ Snapshot stored   ✓ Change verified
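
The recorded SHA-256 fingerprint lets anyone independently confirm that a downloaded copy of the system card matches the snapshot captured for this record. A minimal verification sketch in Python, assuming the snapshot has been saved locally (the filename gpt-4o-system-card.pdf is only a placeholder):

import hashlib

# SHA-256 recorded for record CA-P-003142 (see Evidence Provenance above).
EXPECTED_SHA256 = "7c23ef53467eea199596abe78511d57ffee1e94b50ef10ac0f7d81df278b5059"

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so large PDFs never need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    actual = sha256_of_file("gpt-4o-system-card.pdf")  # placeholder path to your local snapshot
    if actual == EXPECTED_SHA256:
        print("Snapshot matches the archived fingerprint.")
    else:
        print("Mismatch: the document differs from the archived snapshot.")
        print(f"  expected: {EXPECTED_SHA256}")
        print(f"  got:      {actual}")

If the hashes differ, the document has changed since capture or the download is incomplete.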
How to Cite
ConductAtlas Policy Archive
Entity: OpenAI | Document: GPT-4o System Card (PDF) | Record: CA-P-003142
Captured: 2026-03-10 03:40:55 UTC | SHA-256: 7c23ef53467eea19…
URL: https://conductatlas.com/platform/openai/gpt-4o-system-card-pdf/voice-impersonation-and-cloning-risk/
Accessed: May 2, 2026
Classification
Severity: High
Categories:
