Before drafting contract language, parties should establish clear definitions; doing so prevents future disputes. Courts and arbitrators will look to how the parties defined these terms when interpreting agreements.
Understanding the distinctions between these three related but legally distinct concepts is critical for proper rights clearance and talent compensation.
Digital Replica:
Legal Definition: A computer-generated representation of an actual person’s voice, likeness, or performance, created using that person’s actual data.
Key Characteristics:
- Derived from scanning or recording an actual person
- Intended to replicate that specific person’s appearance, voice, or performance
- Recognizable as the specific individual to a reasonable observer
Legal Requirements (SAG-AFTRA):
- Explicit written consent for each intended use
- Additional compensation beyond base fee
- Scope limitations (media, territory, duration)
- Performer approval rights
Synthetic Performer:
Legal Definition: An entirely AI-generated or algorithmically rendered character that resembles a human but is not based on any identifiable person’s data or likeness.
Key Characteristics:
- Created wholly from artificial or composite data.
- May be designed to evoke realism but not a specific individual.
- Does not rely on biometric capture or motion data from a real person.
- May feature invented voices, appearances, or personalities synthesized by AI.
Production Company Obligations (SAG-AFTRA):
- Must provide notice to the union and an opportunity to bargain.
- Require consent to create an asset with a principal feature that is recognizable as that of a specific performer.
Legal Ambiguity:
- Personality rights vs. creativity: Some argue synthetic performers fall outside right-of-publicity laws because they do not depict real people.
- Replacement risk: Others contend that when such avatars mimic an identifiable person’s traits or replace human performers, they should still trigger consent and compensation obligations.
- Attribution and authorship: Raises questions about who owns the performer’s identity — the production company, the AI developer, or the creative team.
Contract Drafting Considerations:
- Define whether synthetic performers may appear alongside or instead of human performers.
- Require disclosure and notice to SAG-AFTRA.
- Address ownership, reuse, and moral rights for synthetic characters that gain commercial value (e.g., digital influencers or virtual brand ambassadors).
Digital Double:
Legal Definition: A visual-effects (VFX) or computer-generated stand-in that replicates an actor’s physical form for specific scenes — typically used in stunts, background crowd work, or complex shots.
Key Characteristics:
- Traditionally created through manual CGI techniques.
- Based on performance or likeness capture of a real actor.
- Limited to a defined use case (e.g., dangerous stunt, unavailable reshoot).
- Intended to supplement, not replace, a performer’s overall role.
AI Era Shift:
- Modern tools can now generate or manipulate digital doubles using AI-assisted techniques (e.g., body doubles trained on prior footage).
- This blurs the line between “traditional VFX” and “digital replica” — potentially invoking the same consent and compensation rules as full digital replicas.
Contract Drafting Considerations:
- Specify whether AI-assisted doubles count as “digital replicas” under SAG-AFTRA rules.
- Require actor approval for any use beyond originally captured scenes.
- Define retention and destruction timelines for double assets (e.g., “digital doubles must be retired or deleted upon contract expiration”).
- Address errors & omissions (E&O) implications — unapproved digital doubles can trigger insurer rejections or claims.
Takeaway
While all three—digital replicas, synthetic performers, and digital doubles—may appear similar on screen, the source of the data and the scope of use determine which legal framework applies. Failing to distinguish among them in contracts can create downstream disputes over ownership, consent, compensation, and credit — especially as AI tools increasingly automate what once required human artistry.
Training Data vs. Derived Outputs vs. Underlying Works
Understanding AI ownership requires distinguishing what goes into the AI system, what comes out, and what remains human-created.
Training Data:
Definition: Materials (such as images, video, audio, scripts, texts, or motion data) fed into AI systems to train models, including large language models (LLMs), to generate new content.
Key Legal Issues:
- Copyright: Whether using copyrighted materials as training data constitutes infringement remains unresolved in U.S. courts.
- Right of Publicity: Using an identifiable person’s likeness, voice, or performance data in AI training requires explicit consent.
- Contractual Control: Ownership of and authorization to use training data must be clearly defined – particularly when production materials are jointly owned or licensed.
Contract Drafting Considerations:
- Define “Training Data” broadly to include all production-related audiovisual and performance data.
- Prohibit use of production materials in AI model training unless explicitly licensed.
- Include representations and warranties confirming that no party has used production materials as AI training data without prior written approval.
- Require disclosure of any third-party AI tools that may retain or learn from project data.
Derived Outputs:
Definition: Content generated by AI systems after training (AI-generated text, images, video, audio).
Critical Copyright Question: Under U.S. Copyright Office guidance (2023), AI-generated content with minimal human input is not copyrightable, while AI-assisted content with substantial human authorship is.
Practical Implications:
- Purely AI-generated dialogue may fall into the public domain
- AI-generated establishing shots may have no copyright protection
- Third parties can freely use uncopyrightable AI outputs
Underlying Works:
Definition: Human-created, copyrightable elements—such as screenplays, performances, cinematography, editing, and musical compositions—that exist independently of AI generation. These works form the legal foundation of a production’s intellectual property chain and determine who owns the copyright and how derivative uses are treated.
Key Legal Issues:
- Authorship and Copyrightability: The U.S. Copyright Office requires “human authorship” for registration. AI-assisted or primarily AI-generated works may lack protection unless a human contributed sufficient creative input.
- Joint Works and Derivatives: When AI tools are used collaboratively (e.g., script rewrites, visual composites), the resulting work may constitute a joint work or derivative work—each with distinct ownership and licensing implications.
- Chain of Title: Clear delineation between human-authored and AI-generated content is essential for securing financing, distribution, and errors & omissions (E&O) insurance. Unclear AI authorship or consent gaps can jeopardize E&O coverage.
- Credit and Compensation: Guilds and unions do not yet recognize AI-assisted contributions as creditable. Contract language should anticipate credit disputes and define whether AI output qualifies as “material” under credit determinations.
Contract Drafting Considerations:
- Define Underlying Works as the human-authored components forming the creative base of the production.
- Require written disclosure of any AI-assisted content incorporated into the final work.
- Specify ownership of AI-modified or AI-enhanced versions of the original human-authored material.
- Include representations confirming that any submitted material claiming copyright protection was not solely generated by AI.
- Clarify whether the producer or author retains rights to AI-derived adaptations, visualizations, or reformats of the underlying human work.
AI Post-Production vs. Traditional VFX
The line between “enhancement” and “alteration” has legal consequences for approval rights and compensation.
Generally Permitted Without Additional Approval:
- Color correction and grading
- Standard noise reduction
- Sharpening and clarity enhancement
- Audio cleanup (removing background noise)
Requires Talent Approval:
- Changing dialogue content or delivery
- Altering facial expressions or emotional tone
- De-aging or age-progression effects
- Synthesizing new dialogue (AI ADR)
- Compositing performance into scenes where talent didn’t actually perform
Traditional VFX relied on human artists creating effects, enhancements, or alterations frame by frame. AI post-production tools can now automatically de-age actors, change backgrounds, enhance resolution, remove objects, or even alter performances with minimal human input.
The critical distinction: approval rights and creative control. Who must approve AI-driven alterations to an actor’s performance? If an editor uses AI to “improve” line delivery by synthesizing alternate takes, has the actor’s performance been materially altered? Contracts must define these thresholds.
Takeaway for Talent and Creators
AI doesn’t erase existing laws – it expands their reach. In New York, unauthorized digital replicas are not creative experiments—they’re potential violations of long-standing civil rights statutes. Producers, talent managers, and distributors should treat consent, scope, and sophisticated contract drafting as the new compliance triad for the AI era.
Ready to protect your production? Contact Rodriques Law to review your contracts, identify AI gaps, and develop strategy.