You’re the head of L&D at a mid-sized company. Six months ago, you convinced your star product trainer to sit for an hour of recording so you could create an AI avatar of him. The avatar has been fantastic—it’s delivered the same product training to 500 employees in four languages, saving thousands in production costs.
Last week, he gave his notice. He’s joining a competitor.
Now your legal team is asking questions you don’t have answers to: Can we keep using his avatar? What if we need to update the training module? Do we need to delete everything and start over?
Why Real People Instead of Stock Avatars?
Before we get into the permission nightmare, let’s address the obvious question: Why not just use the stock avatars that platforms like Synthesia and HeyGen offer? They’re ready to go, no permissions needed, and they look increasingly realistic.
Here’s why organizations keep choosing real people:
Trust and credibility matter. When your Chief Medical Officer’s face delivers compliance training, employees pay attention differently than when a generic avatar does it. Universities have found that students engage more with courses featuring their actual professors, even in avatar form. The familiarity creates connection.
Brand identity is valuable. Your organization’s subject matter experts are part of your brand. Having your actual sales director present product training—even as an avatar—reinforces organizational culture in a way that Stock Avatar #47 never will.
Authenticity drives results. Research on avatar-based training consistently shows higher completion rates and better retention when the avatar represents a real person that learners know or recognize.
The irony is that the very reasons organizations want to use real people—trust, credibility, authenticity—are exactly what create the legal complications we’re about to discuss.
The Promise vs. The Reality of AI Avatars
Companies and universities rushed to adopt AI avatar technology for training. The pitch was compelling: Record your expert once, use them forever. Create multilingual content instantly. Scale your training without scaling your team.
The technology moved faster than anyone’s legal framework.
In my client work, I’ve seen this same scenario play out repeatedly—organizations create avatars first, ask permission questions later, then realize they never got the right releases for what they’re actually doing.
This blows up in three main contexts:
Corporate training departments create employee avatars for onboarding, compliance, and product training—then those employees leave, get promoted, or object to how their likeness is being used.
Universities create professor avatars for MOOCs and online courses—then have to navigate faculty union agreements, intellectual property claims, and questions about what happens when professors retire or die.
Third-party training vendors create subject matter expert avatars—then face questions about whether they actually have the rights to resell or sublicense those digital likenesses.
The common thread? Everyone assumed the technology vendor’s terms of service would cover them. They don’t.
What the Law Actually Says (And Doesn’t Say)
Here’s what makes this complicated: the legal framework for AI avatars is a patchwork of state laws that weren’t written with this technology in mind.
The right of publicity—your legal right to control commercial use of your name, image, voice, and likeness—varies dramatically by state. California, Tennessee, and New York have the strongest protections, and if you’re operating nationally, you need to comply with the most restrictive state’s requirements.
Tennessee’s ELVIS Act (yes, really) specifically addresses AI voice and likeness. It prohibits knowingly publishing an individual’s voice or likeness without authorization and even creates liability for distributing technology designed to produce unauthorized replicas. The law applies whether the person is living or deceased.
California AB 2602, effective January 2025, renders unenforceable any contract for digital replicas that doesn’t include a “reasonably specific description of the intended uses.” It also requires that individuals be professionally represented by legal counsel or a labor union when negotiating these contracts—at least for performers and in contexts where the digital replica replaces work they would have otherwise performed in person.
California AB 1836 extends these protections to deceased individuals, requiring consent for any digital replica of a deceased personality’s voice or likeness in audiovisual works.
These laws were written with Hollywood actors in mind. But they apply to your corporate trainer from Accounting just the same.
The problem most organizations face: They’re using vendor platforms with terms of service that shift liability to the customer. Synthesia’s terms make you responsible for having proper consents. So does HeyGen. So does every other platform. When you click “I agree,” you’re representing that you have all necessary rights and permissions.
Do you?
Where Organizations Get Into Trouble
Let me walk you through the four permission problems that catch everyone off guard.
Problem 1: Initial Consent Isn’t Enough
Creating the avatar requires one set of permissions. Using it requires different permissions.
Most organizations get someone to sign a release form or add language to an employment agreement that says “we can use your likeness for training materials.” That covers creating the avatar. It probably doesn’t cover everything you’re going to do with it.
Vidyard’s platform, for example, automatically deletes your avatar, training video, and consent video when you’re removed from an account. But previously created videos remain with the organization. Is that what your employee agreed to? Does your consent form specify that distinction?
The gap I see repeatedly: Consent forms don’t distinguish between creating the avatar (a one-time act) and the ongoing, potentially perpetual use of that avatar across different contexts.
Problem 2: Minor Edits, Major Questions
Training content changes constantly. Policy updates. Product changes. New compliance requirements. Reorganizations that shift reporting structures.
Does editing the script your avatar reads require new consent?
California law says your contract must specify “reasonably specific description of intended uses.” Here’s where that gets tricky: If you got consent to create an avatar for “compliance training on workplace harassment,” can you use that same avatar for “compliance training on data security”?
Legally, maybe. But “compliance training” as a category might not be specific enough under California’s standard.
One client created an avatar for compliance training. Six months later, they wanted to use the same avatar for sales training. Their lawyer said they needed a new release. The employee had left the company. They had to retire the avatar and start over.
Problem 3: The Departed Employee Problem
What happens when someone leaves your organization?
Current practice varies wildly. Some vendors delete everything automatically. Others leave it to the organization to decide. Some companies keep using avatars indefinitely. Others have policies requiring immediate deletion.
Here’s the legal reality: Standard employment contracts don’t address post-employment likeness rights. Work-for-hire provisions cover content you create—documents, code, presentations. They don’t typically cover your face and voice being used after you’re gone.
You have three options:
Perpetual license: You get consent to use the avatar indefinitely, even after employment ends. This is rare, often requires additional compensation, and creates complicated issues if the person joins a competitor. Would you want your former employee’s avatar still representing your company while they’re actively working against you?
Time-limited license: Common approach. The avatar can be used for X months or years after departure. This creates planning challenges—you need to know when to sunset content and create replacements.
Immediate deletion: Legally safest, operationally disruptive. When someone gives notice, you have two weeks to replace all content featuring their avatar.
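For the time-limited option, the sunset date itself is simple arithmetic; the hard part is tracking it across dozens of avatars so content actually gets retired on time. Here's a minimal sketch of that calculation (the helper name and clamping behavior are illustrative assumptions, not any vendor's feature):

```python
import calendar
from datetime import date

def sunset_date(departure: date, license_months: int) -> date:
    """Last day an avatar may be used under a time-limited license.
    Hypothetical helper: clamps the day for short months, so a
    Jan 31 departure plus one month lands on Feb 28/29."""
    total = departure.month - 1 + license_months
    year = departure.year + total // 12
    month = total % 12 + 1
    day = min(departure.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

# An employee departing March 15, 2025 under an 18-month license:
print(sunset_date(date(2025, 3, 15), 18))  # 2026-09-15
```

Feeding these dates into a content calendar is what turns a license term into an actual replacement plan.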
For universities, this gets even more complex. Professors have more employment protections than corporate employees. Many can make credible claims that course materials—including avatars used to teach those courses—are their intellectual property, not the university’s. Faculty unions may have bargaining rights over digital likeness use.
And then there’s the question nobody wants to ask: What happens when a professor dies? Do you keep using their avatar? Do their heirs have rights? California and Tennessee laws explicitly address post-mortem rights of publicity, which can last for decades.
Problem 4: Scope Creep and Derivative Uses
You got permission to create a training avatar for Module A on product certification.
Now you want to use it for:
- Module B (same product line, different certification level)
- Internal marketing materials promoting the training program
- Customer-facing support videos
- Social media content highlighting your training capabilities
- A case study for the AI vendor who built your avatar
Each one of these represents a potentially different use requiring separate permission.
The vendor terms don’t help you here. They protect the vendor, not you. If you exceed the scope of your consent and someone sues, that’s your problem.
Building a Permissions Framework That Doesn’t Fall Apart
Here’s what actually works, based on patterns I’ve seen from organizations that got this right.
Before You Create Any Avatars
Create a comprehensive consent and release form. Not a one-paragraph addition to your employee handbook. An actual document that specifies:
- Specific description of intended uses (not “training materials” but “Product X certification training modules for internal employees and authorized partner organizations”)
- Duration of permitted use (during employment plus X months/years after, or perpetual with compensation)
- Geographic scope (internal use only vs. public-facing vs. global distribution)
- Derivative use permissions (can we create variations? Can we use the avatar for different subject matter?)
- Compensation structure if any (one-time payment, ongoing royalty, additional payment for extended uses)
- Post-employment/post-departure terms explicitly stated
- Update and modification rights (can we edit scripts? Can we change the avatar’s appearance? How much can we modify before it’s a “new use”?)
- Termination provisions (how can either party end the arrangement?)
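One way to keep these elements honest is to track each signed release as structured data your governance process can check. The sketch below models the checklist as a record with a simple vagueness check; all field names and thresholds are illustrative assumptions, and nothing here substitutes for language drafted by counsel:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical consent record mirroring the checklist above.
# Field names are illustrative, not legal language.
GENERIC_USES = {"training materials", "training", "marketing"}

@dataclass
class AvatarConsentRecord:
    person: str
    intended_uses: list          # e.g. "Product X certification modules for internal employees"
    consent_signed: date
    post_departure_months: int   # 0 = delete at departure
    scope: str                   # "internal" | "partner" | "public"
    derivative_uses_allowed: bool
    compensation: str            # "one-time", "royalty", etc.

    def flags(self):
        """Governance warnings, e.g. a use description too vague to
        satisfy a 'reasonably specific' standard like California AB 2602."""
        warnings = []
        for use in self.intended_uses:
            if use.strip().lower() in GENERIC_USES or len(use.split()) < 4:
                warnings.append(f"use too generic: {use!r}")
        if self.post_departure_months > 24:
            warnings.append("long post-departure window: confirm compensation terms")
        return warnings

record = AvatarConsentRecord(
    person="J. Trainer",
    intended_uses=["training materials"],   # deliberately too vague
    consent_signed=date(2025, 1, 15),
    post_departure_months=12,
    scope="internal",
    derivative_uses_allowed=False,
    compensation="one-time",
)
print(record.flags())  # flags the vague use description
```

The point of the exercise isn't the code; it's that every element of the release becomes something you can audit rather than a paragraph nobody rereads.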
For California and New York: Require legal review or union involvement for performers and in any context where the digital replica replaces work someone would have otherwise performed in person. Don’t try to navigate this with a standard form.
Don’t rely on vendor terms alone. Your contract with Synthesia or HeyGen doesn’t create permissions between you and the individuals whose avatars you’re creating. That’s a separate legal relationship you need to document.
Document your governance process:
- Who has authority to approve avatar creation?
- Who approves new uses of existing avatars?
- How do you track where each avatar is being used?
- What’s your process for reviewing and retiring avatars?
- Who monitors for scope creep?
Build in your exit strategy from day one. Don’t create dependencies you can’t unwind. Plan for what happens when people leave. Consider finite-term licenses even for current employees. Build redundancy so you’re not locked into a single person’s digital likeness for critical training.
For Universities Specifically
Clarify IP ownership before recording anything. Who owns the course content? In many institutions, this is genuinely unclear—professors claim ownership, universities claim work-for-hire. Who owns the avatar itself? That’s separate from course content ownership and needs to be specified.
What happens if the professor leaves? Retires? Dies? These aren’t hypothetical questions. They’re planning requirements.
Consider union implications. Faculty unions may have rights to bargain over digital likeness use. Check your collective bargaining agreements before implementing an avatar program. Don’t assume your standard employment contract covers this.
Address the teaching presence question. Should students know they’re learning from an avatar versus the actual professor? The EU AI Act, which entered into force in August 2024 with transparency obligations for AI-generated content phasing in over the following years, requires disclosure of AI-generated communications. Requirements vary by jurisdiction, but the trend is clearly toward requiring notice.
This isn’t just legal compliance—it’s pedagogical. Students may engage differently if they know they’re interacting with an AI representation rather than their actual instructor.
For Corporate Training
Separate consent for employees versus contractors. W-2 employees can have avatar permissions built into employment agreements, though state law still applies and you still need specific language. Contractors and vendors need separate, explicit agreements that spell out scope and duration.
Subject matter experts from other companies are particularly tricky. You need both the individual’s consent and their employer’s sign-off. If they leave that company and join a competitor, your agreement with the original company may no longer provide coverage.
Implement version control and audit trails. Track every version of every avatar use. Document consent for each deployment. Be able to prove compliance if challenged. This isn’t paranoia—it’s basic due diligence given the legal landscape.
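An audit trail can be as simple as an append-only log that ties every deployment back to the consent document covering it. The sketch below assumes hypothetical function names and schema; any real system would live in your LMS or a database:

```python
from datetime import datetime, timezone

# Illustrative append-only audit trail for avatar deployments.
# Schema and function names are assumptions, not a vendor API.
audit_log = []

def record_deployment(avatar_id, module, consent_ref, approved_by):
    """Append an entry tying one deployment to the release that covers it."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "avatar_id": avatar_id,
        "module": module,
        "consent_ref": consent_ref,   # which signed release covers this use
        "approved_by": approved_by,
    }
    audit_log.append(entry)
    return entry

def deployments_for(avatar_id):
    """Every use of one avatar, e.g. to sunset content when someone departs."""
    return [e for e in audit_log if e["avatar_id"] == avatar_id]

record_deployment("avatar-jdoe", "Compliance 101", "release-2025-003", "L&D Director")
record_deployment("avatar-jdoe", "Product X Cert", "release-2025-003", "L&D Director")
print(len(deployments_for("avatar-jdoe")))  # 2
```

When someone gives notice, a query like `deployments_for` is the difference between a clean two-week sunset and a frantic search for every video their avatar ever touched.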
The Future Is Already Here
The technology is only getting better and cheaper. More use cases will emerge. Regulations will continue to evolve. We’re likely to see federal legislation in the next few years that creates more uniform standards, but until then, you’re navigating state-by-state requirements.
The organizations that win in this environment aren’t the ones that move fastest. They’re the ones that build consent and governance into their processes from the start.
This isn’t just a legal issue—it’s a trust issue with employees and students. When people feel their digital likeness is being used without clear boundaries and proper consent, it damages organizational culture. When they feel respected and fairly compensated, they’re often enthusiastic participants in creating innovative training content.
The key principle: Treat AI avatars like you’d treat someone’s actual face and voice in traditional video—because legally and ethically, that’s exactly what they are.
Start with consent, not convenience. Get the permissions right. Build the governance framework. Then unleash the technology.
Your future self will thank you when you’re not frantically deleting training content because someone’s lawyer sent a cease-and-desist letter.
About the Author
John Rood is the founder of Proceptual, where he helps organizations build practical AI governance systems that actually work. He has taught AI governance at Michigan State University and the University of Chicago, and his writing has appeared in HR Brew and Tech Target. He has spoken at the national SHRM conference and works with organizations ranging from startups to private equity portfolio companies on AI implementation and governance.