“A deepfake audio clip was created using your voice. It sounds exactly like you announcing an endorsement for a casino. You never consented. The clip has 10 million views. You have no clear legal recourse—until now.”
The ELVIS Act (Ensuring Likeness, Voice, and Image Security Act) was signed into law in Tennessee in March 2024 and took effect on July 1, 2024. It is the first U.S. statute to explicitly protect a person’s voice against synthetic media (deepfakes and AI-generated replicas).
Before the ELVIS Act, voice protection was fragmented: some states had “right of publicity” laws, some had personality rights statutes, and others had nothing. A deepfake could be created in one jurisdiction with few legal consequences, then distributed globally. Creators and celebrities had limited recourse.
Voice rights are now a critical legal protection. The ELVIS Act establishes a clear statutory standard, creates causes of action for unauthorized synthetic voice use, and provides damages and injunctive relief. But the law leaves many questions unresolved: Does it apply to parody? What about accidental infringement? How do you prove infringement?
This guide explains the ELVIS Act, how it protects your voice, state-level voice rights, enforcement mechanisms, damages, and how to protect yourself from unauthorized synthetic voice use.
1. What Is the ELVIS Act? (Effective July 1, 2024)
The Statute
The ELVIS Act amended Tennessee’s Personal Rights Protection Act (Tenn. Code Ann. § 47-25-1101 et seq.) to add explicit protection against synthetic media—specifically unauthorized use of a person’s voice and likeness in digitally created content. Key provisions:
- Unauthorized synthetic voice use is unlawful: Creating or distributing a “digital replica of the voice” of a person without consent is a violation of the right of publicity.
- Civil cause of action: You can sue for unauthorized synthetic voice use (in Tennessee state court, or in federal court where diversity jurisdiction applies).
- Damages: Actual damages, profits from infringement, or statutory damages ($2,500-$25,000 per violation).
- Injunctive relief: Courts can order removal of the deepfake or prevent its distribution.
- Applies to deceased individuals: Heirs or estates can protect the voice of deceased celebrities.
What It Doesn’t Protect
The ELVIS Act includes exceptions (similar to right of publicity laws):
- Authorized use: If you consented to the synthetic voice use, it’s not a violation.
- Parody & satire: Mimicking your voice for comedy is protected speech (First Amendment).
- News & journalism: Using synthetic voice for news purposes is likely protected.
- Incidental use: Voice used in context not associated with commercial endorsement or commercial benefit may not be a violation.
- Political speech: Uses tied to commentary on matters of public concern may receive heightened First Amendment protection; courts are still drawing these lines.
2. The ELVIS Act vs. State Right of Publicity Laws
Before the ELVIS Act, voice protection was handled entirely by state right of publicity laws. Tennessee’s statute is now the most explicit voice protection in the country, and other states retain their own protections.
| Jurisdiction | Voice Protection Standard | Damages Available | Key Differences |
|---|---|---|---|
| ELVIS Act (Tennessee) | Unauthorized synthetic voice use (digital replica) | Actual damages, profits, or statutory $2,500-$25,000 per violation | Applies to conduct with a Tennessee nexus; explicitly covers deepfakes and voice cloning; First Amendment exceptions |
| California | Right of publicity (including voice). Civil Code § 3344; § 3344.1 (deceased) | Actual damages and profits, or a statutory minimum of $750 per unauthorized use | Very strong protection; covers deceased celebrities for 70 years after death |
| New York | Right of publicity (voice included). Civil Rights Law §§ 50-51; § 50-f (deceased performers) | Actual damages or profits from use | Covers voice; post-mortem protection for 40 years; narrower exceptions than California |
| Texas | Right of publicity for deceased individuals (includes voice). Property Code ch. 26; common law misappropriation for the living | Actual damages and profits; exemplary damages possible | Covers deceased for 50 years after death; no general statute for living individuals |
| Most Other States | No specific statutory protection; common law right of publicity may apply | Varies; typically actual damages only | Weak protection; must prove commercial misappropriation; voice may not be covered |
How to Use Both ELVIS Act and State Laws
You may be able to combine an ELVIS Act claim with a state right of publicity claim, depending on where the conduct occurred and where the parties are located. This gives you multiple causes of action. Example: a Tennessee-based artist whose voice is cloned sues under the ELVIS Act; if the deepfake was also exploited commercially in California, the artist adds a claim under California’s § 3344. Prevailing on multiple theories can increase total recovery, though courts generally will not award duplicate damages for the same harm.
3. How to Prove an ELVIS Act Violation
The Legal Standard
To win an ELVIS Act claim, you must prove:
- The defendant created or distributed a “digital replica of the voice.” This means a synthetic or deepfake audio that realistically imitates your voice. Courts are still defining what counts as “realistic.”
- You did NOT authorize the use. You must prove lack of consent (can be written or verbal, but written is stronger).
- The use was commercial or implicated the right of publicity. Example: An endorsement, advertisement, or use suggesting you made a statement.
- The use caused damages. You can prove actual damages (lost endorsement deals, reputation harm) or rely on statutory damages.
Evidence You Need
- The deepfake audio file: Screenshot, download, or video of the synthetic voice use
- Voice analysis: Expert testimony that the audio is a digital replica (not your actual voice recorded)
- Proof of lack of consent: Your testimony that you never authorized this use; emails showing you objected
- Damages evidence: Lost deals, negative publicity, emails from clients saying they won’t work with you due to the deepfake
- Distribution evidence: Views, engagement, reach of the deepfake (shows harm)
Burden of Proof
The burden is on YOU to prove the violation. This is challenging because:
- Voice analysis is not always definitive (AI can create convincing replicas)
- Proving damages is difficult (how much did you lose?)
- Finding the defendant can be hard (anonymous accounts, overseas creators)
Exception: Statutory damages ($2,500-$25,000) are available even without proving exact financial harm. This makes the ELVIS Act easier to use than traditional right of publicity claims.
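As a back-of-the-envelope illustration of how statutory damages scale with violation counts, here is a minimal sketch. The $2,500-$25,000 band is the range cited in this guide; actual awards are set by the court, and how "per violation" is counted varies case by case.

```python
def statutory_damages_range(violations: int) -> tuple[int, int]:
    """Illustrative only: total statutory-damages range for a given
    number of violations, using the $2,500-$25,000 per-violation band
    cited in this guide. Courts set the actual per-violation figure."""
    PER_VIOLATION_MIN = 2_500
    PER_VIOLATION_MAX = 25_000
    if violations < 1:
        raise ValueError("need at least one violation")
    return violations * PER_VIOLATION_MIN, violations * PER_VIOLATION_MAX

# A deepfake endorsement posted to 3 platforms, if each posting
# counts as a separate violation:
low, high = statutory_damages_range(3)  # (7500, 75000)
```

This is why statutory damages matter: even without proving a single lost deal, the potential exposure grows with every repost the defendant makes.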
4. Types of Synthetic Voice Violations Under ELVIS Act
1. Voice Cloning for Endorsements
A company creates a digital replica of your voice advertising their product without permission. You never made the endorsement. You can sue for statutory damages and profits from the endorsement deal.
2. Deepfake Audio for Defamation
Someone creates audio that sounds like you confessing to a crime or making a racist statement. It spreads online. You can sue under ELVIS Act for unauthorized voice use + potentially sue for defamation/false light.
3. Voice Cloning in Entertainment
A studio creates a synthetic version of your voice for a video game, movie, or audiobook without permission. This is a commercial use of your voice without consent. ELVIS Act violation.
4. Political Deepfake Audio
A deepfake audio suggests you made controversial political statements before an election. This is murky legally—ELVIS Act protects voice, but political speech gets First Amendment protection. Still actionable if clearly false and damaging.
5. Voice Cloning in Scams
Scammers use your voice to convince people to send money (“This is me, I need $10,000 sent immediately”). ELVIS Act violation + fraud. You can sue and get law enforcement involved.
6. Parody (NOT a Violation)
A comedy show does a funny impression imitating your voice. Even if it’s a deepfake, this is protected parody/satire. First Amendment defense. ELVIS Act doesn’t apply.
5. Damages & How to Enforce ELVIS Act Rights
Available Damages
- Statutory Damages: $2,500-$25,000 per violation (determined by court based on factors like willfulness, reach, harm)
- Actual Damages: Provable financial harm (lost endorsement deals, lost income)
- Profits from Infringement: If the deepfake generated revenue (ads, views), you can recover those profits
- Injunctive Relief: Court orders removal of the deepfake, prevents further distribution
- Attorney’s Fees: If defendant acted willfully, you may recover legal costs
Enforcement Steps
Step 1: Cease & Desist Letter
Send a formal demand to the creator/distributor to remove the deepfake. Many will comply without litigation. Your attorney can send this.
Step 2: Platform Removal
File a takedown request with the platform hosting the deepfake (YouTube, TikTok, X, etc.). Use a DMCA notice only if the deepfake copies your copyrighted recording; otherwise use the platform’s impersonation or synthetic-media reporting tools. Most major platforms now have deepfake removal policies.
Step 3: Lawsuit
If the deepfake remains, file suit under the ELVIS Act. Claim statutory damages ($2,500-$25,000 per violation). This forces the defendant to answer or settle.
Step 4: Criminal Referral
If the deepfake is used for scams, extortion, or defamation, refer to law enforcement. Some deepfakes may violate criminal fraud or identity theft statutes in addition to ELVIS Act.
6. Red Flags & Legal Exceptions to ELVIS Act
Red Flag #1: Parody Is Protected. A deepfake mimics your voice for a funny sketch. Even if it’s a perfect synthetic replica, it’s likely protected parody/satire under the First Amendment, and the ELVIS Act’s exceptions would apply.
Red Flag #2: News & Journalism Exception. A news outlet uses your synthetic voice to explain a complex topic (e.g., AI news coverage). This may be protected journalism, and an ELVIS Act claim would likely fail.
Red Flag #3: Consent Defense. You authorized the voice clone for one use, but it was used for another. The defendant claims you consented to voice cloning technology. Courts will look at the scope of consent—limited consent may not cover all uses.
Red Flag #4: Unintentional Infringement. Someone trains an AI on your voice without realizing it would be used commercially. Tennessee’s statute primarily targets knowing uses, so intent can matter: an unwitting distributor may still face injunctive relief, but a damages claim is harder to sustain.
Red Flag #5: Speech vs. Voice. The ELVIS Act protects “voice” (the sound), not the words. If a synthetic voice says something you wouldn’t say, both ELVIS Act and defamation claims may apply. But pure speech claims are harder to litigate.
Red Flag #6: Deceased Individuals. Tennessee’s Personal Rights Protection Act protects a deceased individual’s rights for at least 10 years after death, and longer while the rights remain in commercial use. Heirs and estates can sue. This is powerful for protecting iconic voices (Elvis Presley is the statute’s namesake).
Red Flag #7: International Deepfakes. A deepfake is created overseas and distributed globally. U.S. courts may not have jurisdiction over the creator. You may need to sue in the country where it was created, or rely on international IP treaties and platform takedowns.
7. How to Protect Your Voice Rights
1. Monitor Your Voice Online
Set up Google Alerts for your name + “deepfake,” “synthetic voice,” “AI voice.” Act quickly if a deepfake appears.
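Google Alerts can deliver results as an RSS feed, which makes this monitoring easy to automate. Below is a minimal sketch; the function name and the dict-based entry format are my own illustration, not any particular library’s API. It flags feed entries that pair your name with a deepfake-related term.

```python
DEEPFAKE_TERMS = ("deepfake", "synthetic voice", "ai voice", "voice clone")

def flag_alert_entries(entries, name):
    """Return feed entries whose title or summary mentions the person's
    name together with a deepfake-related term. `entries` is a list of
    dicts with optional 'title' and 'summary' keys."""
    flagged = []
    for entry in entries:
        text = f"{entry.get('title', '')} {entry.get('summary', '')}".lower()
        if name.lower() in text and any(t in text for t in DEEPFAKE_TERMS):
            flagged.append(entry)
    return flagged

# Illustrative feed entries:
sample = [
    {"title": "New deepfake of Jane Doe endorses casino", "summary": "viral clip"},
    {"title": "Jane Doe interview", "summary": "career retrospective"},
]
hits = flag_alert_entries(sample, "Jane Doe")  # only the first entry
```

In practice you would fetch the alert feed on a schedule (e.g., with a cron job) and send yourself a notification for each hit, so you can act within hours rather than weeks.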
2. Document Your Voice
Register your voice with voice analytics services. Some companies (e.g., voice forensics firms) can create a “voice fingerprint” proving audio is/isn’t you.
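To make the “voice fingerprint” idea concrete, here is a deliberately toy sketch: it compares two audio snippets by the cosine similarity of their DFT magnitude spectra. Real forensic voice comparison uses far richer features (pitch contours, formants, learned embeddings) plus expert interpretation; this only illustrates the underlying “compare spectral signatures” concept.

```python
import cmath
import math

def spectrum(samples):
    """Magnitude spectrum via a naive DFT (fine for short snippets)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def spectral_similarity(a, b):
    """Cosine similarity of two snippets' magnitude spectra (0.0 to 1.0)."""
    sa, sb = spectrum(a), spectrum(b)
    dot = sum(x * y for x, y in zip(sa, sb))
    na = math.sqrt(sum(x * x for x in sa))
    nb = math.sqrt(sum(x * x for x in sb))
    return dot / (na * nb) if na and nb else 0.0

# Two pure tones: identical frequencies match, different ones don't.
tone_a = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
tone_b = [math.sin(2 * math.pi * 13 * t / 64) for t in range(64)]
```

Here `spectral_similarity(tone_a, tone_a)` is essentially 1.0 while `spectral_similarity(tone_a, tone_b)` is near zero, which is the intuition behind proving that an audio clip is (or is not) your actual recorded voice.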
3. Use Contracts to Restrict Voice Cloning
In talent agreements, music contracts, and endorsement deals, explicitly prohibit voice cloning or synthetic voice use without separate written permission. Example clause:
“Talent’s voice and likeness are licensed only for the Uses defined in this Agreement. Company shall not create, authorize, or permit the creation of a digital replica, deepfake, or synthetic version of Talent’s voice without Talent’s written consent and additional compensation.”
4. Copyright Your Voice Recordings
Register your voice recordings with the U.S. Copyright Office. This protects against unauthorized copying of your actual recordings (separate from voice cloning rights).
5. File DMCA Takedowns Aggressively
When you find a deepfake, immediately file takedown notices with the hosting platform (DMCA if your copyrighted recording was copied; otherwise the platform’s synthetic-media or impersonation policy). Many platforms act within days.
6. Get Voice Insurance
Some entertainment insurance policies now cover synthetic media infringement. Check if your current policy covers deepfakes and voice cloning.
7. Preserve Evidence
If you find a deepfake, screenshot it, download the file, and document the date/time. This evidence is crucial if you sue later.
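The preservation step can be partly scripted. Below is a minimal sketch, assuming you have already downloaded the file; the field names are my own convention, not a legal standard. It records a SHA-256 hash and a UTC timestamp in a sidecar JSON file, which helps show the file was not altered after capture.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def preserve_evidence(path, source_url, notes=""):
    """Hash a downloaded deepfake file and write a sidecar JSON record
    capturing what was preserved, when, and from where."""
    data = Path(path).read_bytes()
    record = {
        "file": str(path),
        "sha256": hashlib.sha256(data).hexdigest(),
        "size_bytes": len(data),
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "source_url": source_url,
        "notes": notes,
    }
    Path(str(path) + ".evidence.json").write_text(json.dumps(record, indent=2))
    return record
```

Pair this with screenshots of the post (showing view counts and the URL) so the hash, the timestamp, and the visual context corroborate each other.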
8. Know Your State’s Laws Too
If you live outside Tennessee, your state’s right of publicity law may be your primary remedy. California, New York, and Texas have notably strong protections; combine them with ELVIS Act claims where a Tennessee nexus exists.
8. Conclusion: The ELVIS Act Is a Game-Changer for Voice Rights
Voice rights have entered the mainstream legal framework. The ELVIS Act is the first statute to explicitly target synthetic voice infringement, giving creators, celebrities, and public figures a clear cause of action and meaningful damages.
Key takeaways:
- Unauthorized synthetic voice use is now explicitly unlawful under Tennessee’s ELVIS Act (effective July 1, 2024)
- You can recover statutory damages of $2,500-$25,000 per violation without proving exact financial harm
- Parody and news journalism are protected exceptions
- Combine ELVIS Act claims with state right of publicity laws for broader protection
- File platform takedowns immediately when you find deepfakes
- Use contracts to explicitly prohibit voice cloning without consent
- Heirs can enforce a deceased individual’s voice rights for at least 10 years after death, longer with continued commercial use
The technology for creating perfect voice clones is improving rapidly. The ELVIS Act gives you legal recourse when deepfakes happen. Use it.