Imagine discovering a trailer for a film you never auditioned for—yet there you are, crying, laughing, delivering lines. An AI-generated double has taken your place, without consent. Overnight, your reputation, career trajectory, and livelihood are hijacked.
This isn’t a distant dystopia. It’s already here.
Artificial intelligence has made it possible to convincingly replicate human faces, voices, and even mannerisms with a few minutes of source material. From de-aging major stars to resurrecting deceased performers, AI is reshaping the boundaries of performance. But when those replicas are used without permission, the line between innovation and exploitation blurs.
For actors, the risks are immense: reputational harm, loss of future work, and uncompensated use of their image. For producers, the risks include lawsuits, bad press, and union disputes. The legal system is only beginning to catch up, and the stakes go beyond compensation and copyright to the manipulation of a performer's image and likeness itself.
During the 2023 Hollywood strikes, artificial intelligence became a flashpoint. SAG-AFTRA, the union representing actors, fought to secure guardrails around digital replicas. Its position is clear: consent, compensation, and control.
The new union agreements reflect these principles, but enforcement ultimately depends on what’s in individual contracts.
Real-world disputes underscore why these protections matter. During the 2023 strikes, several background actors reported that their body scans and likenesses were being stored for potential AI reuse, in some cases described as lasting "for the rest of eternity," without clear terms or additional compensation. In 2025, SAG-AFTRA escalated its pushback by filing an unfair labor practice charge against Llama Productions over an AI-generated version of James Earl Jones' voice for the Darth Vader character in the video game Fortnite. The union argued that replicating a deceased performer's voice without bargaining not only violated SAG-AFTRA members' rights but also deprived living performers of potential work.
At the federal level, lawmakers introduced the NO FAKES Act (Nurture Originals, Foster Art, and Keep Entertainment Safe). If passed, it would prohibit creating or distributing a digital replica of a living or deceased person in performance without consent.
In addition to the NO FAKES Act, Congress is considering other measures to regulate synthetic media. The COPIED Act (Content Origin Protection and Integrity from Edited and Deepfaked Media Act), supported by SAG-AFTRA, would establish federal transparency guidelines for AI-generated content. The union has also endorsed the Preventing Deepfakes of Intimate Images Act and the AI Labeling Act, which would target harmful nonconsensual uses and require clear disclosure when material is AI-created.
Together, these proposed federal laws aim to fill a major gap: right of publicity protections currently vary state by state. California and New York provide strong safeguards, but other jurisdictions remain far less clear. A consistent national standard could give performers a stronger and more predictable legal shield across the country.
Until then, contracts remain the frontline defense.
Deepfake disputes touch multiple areas of law: the right of publicity, false endorsement under the Lanham Act, copyright, defamation where content damages a reputation, and the labor and contract obligations built into union agreements and individual deals.
Without uniform rules, every production risks entering murky territory. That’s why proactive agreements are essential.
These risks are not hypothetical. In 2024, actress Scarlett Johansson confronted OpenAI after the company released an AI-generated voice that closely resembled her own. Johansson took legal action, arguing that the use of a soundalike without her consent violated her rights. SAG-AFTRA supported her position, and OpenAI ultimately withdrew the voice option. The controversy illustrated how quickly deepfake disputes can implicate rights of publicity, false endorsement, and reputational harm—all before clear federal standards are in place.
Deepfake and digital likeness issues are not confined to Hollywood. Around the world, governments are beginning to confront the same risks of exploitation, misinformation, and loss of creative control.
For multinational productions, these differences matter. A project distributed globally may be subject to multiple—and sometimes conflicting—legal regimes. Producers and performers alike should consider not just U.S. contract clauses, but also how their rights and obligations might shift once content crosses borders.
Digital replicas also raise new chain-of-title challenges. A film or series cannot be cleanly distributed unless all rights are properly cleared, and that now includes AI-driven likeness rights. If a performer's digital double is used without written consent or fair compensation, it can cloud ownership of the finished work, jeopardize insurance, and even prevent distributors from acquiring the project. For producers, this makes documenting consent and compensation in contracts not just a creative issue, but a critical part of the project's copyright ownership and marketability.
These issues also tie directly into risk management. Standard Errors & Omissions (E&O) policies are beginning to adapt to AI and likeness claims, but they may not cover every scenario. Completion bond companies may now require proof that all digital replica rights are secured before they guarantee delivery, since unresolved likeness disputes can derail a project. Alongside vendor warranties and indemnities, these tools are becoming essential for ensuring that financing and distribution move forward without costly surprises.
Producers embracing AI technology need to protect both themselves and their talent. Strong contracts should include:

- Explicit, written consent for the creation and use of any digital replica, structured as opt-in rather than opt-out
- Clear limits on the scope, duration, and context of AI use
- Fair compensation terms tied to each use of a performer's digital likeness
- Warranties and indemnities from AI vendors confirming their tools do not infringe third-party rights
- Errors & Omissions coverage that specifically addresses AI and likeness claims
These provisions protect productions from lawsuits while signaling respect for creative collaborators.
If a producer holds your image and can manipulate it any way they want, why would they need to hire you again? Actors should approach every new project with AI clauses top of mind. Key steps include:

- Confirming that any digital replica provision is opt-in, not opt-out
- Redlining overbroad "all rights" or in-perpetuity language
- Insisting on clear limits on the scope and duration of AI use
- Negotiating separate, fair compensation for any use of a digital double
Until Congress passes a nationwide law, disputes over digital doubles will be handled piecemeal, state by state. Studios may continue pushing for broader rights, while talent will demand stronger protections. Litigation is inevitable.
But the future of digital doubles doesn’t have to be adversarial. When balanced contracts respect consent, compensation, and control, AI can expand creative possibilities without undermining human performers.
For a related perspective on how AI is reshaping the entertainment industry, see our article "Who Owns AI-Generated Content? When Copyright Law Meets Creative Technology." Together, these issues highlight both sides of the challenge: protecting your creative works and safeguarding your personal likeness.
A deepfake contract is a legal agreement that sets limits on how an actor’s voice, image, or likeness may be digitally replicated with artificial intelligence. These contracts typically require consent, outline compensation, and restrict the scope and duration of AI use.
Yes, performers can refuse. Under current SAG-AFTRA agreements and many state right-of-publicity laws, they can decline to have their likeness digitally replicated. The key is making sure the contract is written as opt-in, not opt-out.
Unauthorized use can trigger claims for violation of the right of publicity, false endorsement under the Lanham Act, copyright infringement, or even defamation if the content damages the actor’s reputation. Lawsuits can also lead to bad press and union disputes.
Producers should include clear consent clauses, negotiate fair compensation terms, and secure warranties from AI vendors confirming their tools do not infringe on third-party rights. Errors & Omissions insurance that specifically covers AI-related claims is also recommended.
Deepfakes are no longer a futuristic worry—they are today’s legal and ethical battleground. Actors must defend their likeness, and producers must manage new risks without stifling creativity. The most effective tool remains the same: carefully drafted contracts that put consent, compensation, and control in writing.
For performers, that means redlining overbroad “all rights” clauses and demanding clear limits on AI use. For producers, it means negotiating vendor warranties, securing proper insurance, and respecting talent’s digital rights.
When balanced agreements are in place, AI can expand creative possibilities instead of undermining human performers. The key is to treat contracts not as routine paperwork, but as the frontline defense in the age of digital doubles.
Navigating these issues requires more than awareness—it requires strong legal guidance. Rodriques Law, PLLC helps actors, producers, and creatives draft contracts that balance innovation with protection.