NAVA Founders and Voice Actors Tim Friedlander and Carin Gilfry Discuss AI Bills and the Importance of Having an AI Rider

January 15, 2025 | Chris Butera
Photos courtesy of Tim Friedlander and Carin Gilfry.

Unless you’re plugged into the voiceover world, Tim Friedlander and Carin Gilfry may not be names you’re familiar with. Whether you’ve heard them is another story: their voices have been, well, everywhere, appearing in everything from commercials and eLearning to video games and animation.

We could discuss their storied voiceover successes and achievements in the industry, but the mission they’re working on behind the scenes is bigger than their credits.

Friedlander and Gilfry are the heads and co-founders of the National Association of Voice Actors (NAVA), a social-impact non-profit that advocates for voice actors and promotes the advancement of the voice acting industry through education, inclusion and benefits. NAVA sponsors and speaks at voiceover conferences such as VOcation, hosts free and member-exclusive webinars on its site, provides members with access to healthcare, and offers a wealth of resources available to all.

NAVA is also making strides in Washington, D.C., meeting with policymakers to shape AI protection laws. Gilfry, Friedlander and their peers make their voices heard at some of Capitol Hill’s biggest tables and take officials to task.

In the first of two features, Friedlander and Gilfry discuss the work they’re doing in Washington and break down the key AI bills currently in play, the pitfalls of AI, and how actors both on camera and in the booth can shield themselves from nefarious AI practices, including the use of an AI rider in contracts.

Author’s note: To support the voiceover community in Los Angeles, NAVA has set up a California Wildfire Relief Fund where those affected can apply for relief. To donate or request relief, visit NAVA’s Cal-Fire Fund page.


Insights: What You Need to Know About AI in Voiceover

  • Add AI-specific clauses to your contracts to prevent misuse of your voice and likeness for AI purposes.
  • Demand fair compensation and maintain control over the use of your digital clones in any media.
  • Keep up with AI legislation and support efforts to protect personal rights over one’s voice and image.

Tell me about the fAIr Voices campaign just to start it off. What’s the goal? How did it come about?

Carin Gilfry: The fAIr Voices campaign was an initiative that we started to educate the community about AI voices and what we want to see in contracts. The three things we really want to make sure are in all contracts that deal with AI (synthetic voice production, digital clones and video clones) are fair consent, control and compensation.

Consent, meaning that you can’t take my biometric data, my voice or my visual likeness without my permission and use it for the purposes of AI. Control, meaning that if you create a synthetic version of me with my consent, I get to decide, to a certain extent, what that digital version of me does.

It’s not just, “We have your biometric data, we created this voice, and now anyone can do anything they want with it.” An actor doesn’t want that. We want to have control over what we’re saying and doing because our identity is our brand.

It’s our business. It’s what we make a living from, and it can also go to your personal beliefs. You want to make sure that you are not saying or doing things that you didn’t intend.

And compensation, meaning that if a company is making money from you and your digital clone and digital likeness, that you are being paid fairly.

Being paid fairly—I think it’s still kind of the Wild West out there. There are not really set rates. In our mind, being paid fairly means that both parties agree that the deal that they’ve made is a deal that they want to make, and no one’s being taken advantage of.

NAVA is also going to Washington to help policymakers understand the ramifications of unregulated AI and create bills that we hope will become AI protection laws. What’s the process like? What skills are needed for these conversations and how are the conversations going?

Tim Friedlander: There’s a lot in that one. We’ll work backwards and say that the conversations have been very positive. Overall, the discussions that we have had have not been very polarizing. There aren’t a lot of people saying, “Yes, we should have the right to steal people’s voices and do whatever we want with them.” It’s a very bipartisan discussion in the Capitol. We’ve been on the Senate and the House side, and it’s been positive on both sides of the aisle in both houses.

The trouble becomes whether the government can move at a speed that keeps up with the technology and get legislation done in time. There’s a very strong possibility that by the time a bill gets in place, the terminology and the technology no longer line up, and we have to rewrite the bill to cover the new technology that exists.

As far as the skills needed, I think [it’s] just having this belief that we’re going in for the right reasons, discussing something that concerns our friends and ourselves personally. I think it’s very easy to do, and I think it’s very powerful.

We’re not walking in saying we’re concerned about something that might happen to somebody else. We’re walking in saying that this thing we’re concerned about does affect us; we have firsthand experience with how it’s going to affect our community, and we can speak to this.

We can speak to the reality of what is happening on a daily basis in the community, versus having to say, “Well, let me go back and check with somebody and see if we can get some information.” We have that firsthand experience and can speak from it in the room, which has become easy.

The three of us (myself, Carin and [voice actor] Matthew Parham) like talking to people, so when we go to Capitol Hill, it becomes easy to go into these rooms. I think before the first meeting, we were all nervous. After that first meeting, we were like, “This is easy. We just have to talk, explain and bring up these concerns we have.”

We got into it through some advocacy work with one of our board members, who connected us with people who share the same concerns we have and were able to get us into those rooms.

How many bills are actively being worked on with regard to AI regulation at once?

CG: There are a lot. There are several AI legislation trackers that you can look up online to see all of the different bills that are in play right now in the House and Senate. There are two that we’ve been focused on. One in particular looks like it has the most legs, and that is the NO FAKES Act.

The NO FAKES Act was introduced in the Senate on July 31, 2024, and a House version was introduced a few weeks after that. It has a lot of support.

What the NO FAKES Act does is essentially give every single person intellectual property rights over their voice, image, name and likeness, because currently in the U.S. there is no federal law that says you own the rights to your voice, your name or your likeness.

In certain states, you’re protected by state laws, and celebrities are a bit more protected than the average person. I think voice actors fall into a kind of gray area because our voices are out in public all the time.

People may recognize our voices, or recognize them as something familiar, but they are not seeing me walking around and going, “My God, there’s Scarlett Johansson.” She’s very recognizable as a person; my voice is maybe recognizable, but my image is not.

There are some protections in place at the state level, but there is currently nothing at the federal level. That’s what the NO FAKES Act would address.

We also have the bills that Gavin Newsom recently signed that created the AI protections for Californians: AB 1836 and AB 2602. What’s the difference between these bills and the NO FAKES Act, and what role did NAVA play in the creation of this legislation?

TF: NAVA came into this later in the game. Both of these bills were written by SAG-AFTRA and the people there who have been working on this for quite a while. I think some of the early discussion at SAG-AFTRA started as far back as 2013, when they started talking about AI. NAVA came in to bring our singular perspective from the voice actor side of things to the conversation.

These two bills are kind of narrow in scope. AB 1836 protects post-mortem rights, [where] either the person or their estate has to grant the rights in order to have a voice synthesized. And AB 2602 basically makes a state-level requirement for informed consent and a reasonable description of the work you’re going to do with an AI company. That’s the one we see as most relevant for voice actors and all artists in California, or actors working with a company that is based in California.

Since some of the major companies are based in California, this is going to have ramifications for most people in the United States who work with companies in California. It requires a reasonable description of the work that is going to be done with your synthesized voice, your voice clone.

[In the past, companies may have said], “We want to make a synthesized voice, but we don’t know how we want to use it yet, so we’re going to ask for all rights.” That can’t happen anymore. They’re going to have to say, “We want to use it not just for an audiobook, but for this audiobook on these platforms for this usage.” That’s a requirement we as voice actors are going to have to enforce, by pointing out that we have this state bill behind us that requires companies to give us more information.

We can’t just do a job that asks for unlimited media rights across all mediums. That’s [not] something that we’re going to be able to allow going forward. NAVA is working on putting together information to educate voice actors and all artists on how to navigate that. That, I think, is going to be a very strong bill for us going forward.

Newsom also vetoed SB 1047. I wanted to get your thoughts on that.

TF: SB 1047 was one that we were supporting. It would have required more enforcement and transparency for the foundational models behind these systems. It was generally considered a long shot; we weren’t expecting it to go through.

Had it passed, there would have been very strong enforcement for the AI companies, and they pushed back very strongly as well.

What else is being done to protect actors from AI misuse on camera and voice?

CG: One thing that we have been really pushing for is for people to add riders to their contracts that specifically limit the use of whatever media you generate on a job (your regular job, not even an AI job) to just that job and don’t allow its use for AI.

NAVA has an AI synthetic voice contract rider, which is free for anyone to use. You can download it from our site. It can be amended for on-camera work; it can be amended for a lot of different things, actually. The rider basically says that the files and media we create for this job are only to be used for the specific purpose of this job, and that you, [the client,] may not use my data to make or train a digital version of me. Companies have been adopting it really easily.

I have worked with a couple of companies where they say, “Here’s the contract and it includes this NAVA AI Synthetic Voice Rider,” and I didn’t even have to send it to them first. They’re sending it to me first, which I think is really cool. That’s a level of protection that is really good.

Now, the problem with adding a rider to your contract, if you’re non-union, is that if the company is in breach of that contract, you have to take them to court yourself. What’s great about being a member of SAG-AFTRA and working under a SAG-AFTRA contract is that if the company is in breach of contract, the union will fight for you, and its lawyers will take on the enforcement of that contract, so it’s not entirely on you.

The strike that’s happening now in video games, on the interactive contract, is almost entirely centered on AI protections. The TV/theatrical strike was also almost entirely centered on AI protections.

SAG-AFTRA made some gains there, and we’re hoping for the same in the video game strike. But it’s true that probably the safest way to work right now as an actor is under a union contract that includes AI protections. If you’re non-union, you can add a contract rider, but it means that if they’re in violation of that contract, you have to fight them yourself.

TF: [The rider] was written by a lawyer and a contract specialist, who put it together to make it as airtight as any contract can be.

It is there for anybody to use. You’re welcome to go on the website and download it. You just download it as is, fill in the information, sign it, and send it over to production [to sign and send back as well].

You mentioned this before with legislation: AI protections seem to be moving quicker than other policies we’d like to see enacted, but as you said, it’s not fast enough. There are concerns that the AI companies are going to keep finding loopholes, and we’ll just keep doing this dance of rewriting a bill and playing catch-up. How do you combat this?

CG: I think that’s why protecting the individual, your voice, image, name and likeness, is better than writing a law against a specific form of technology. When you are protecting the person, it doesn’t matter if some new holographic technology comes along that makes an instant hologram of you when you walk through a mall, or something like that.

It doesn’t matter how the technology changes; you are protected as a person. I think the way we combat the speed of technological advancement is to protect the person, instead of making laws about the technology specifically.

Disclaimer: This article is for informational purposes only and should not be considered legal advice. For professional legal assistance, please consult a licensed attorney.

