Can You Sue Someone for Making AI Videos of You?
Key Takeaways
- AI-generated videos and “deepfakes” have exploded in number and sophistication, making misuse a widespread and very real problem.
- Victims of non-consensual AI videos may have legal claims under privacy, publicity, and defamation laws.
- Proposed federal legislation like the NO FAKES Act and new state laws aim to give individuals more control over unauthorized AI content.
- Civil lawsuits can target creators, distributors, and platforms depending on the facts.
- Early action matters: evidence disappears fast, and digital content is easily copied and re-shared.
AI videos, often called deepfakes, use artificial intelligence to generate or manipulate visual and audio content so that it appears real. These videos can make it look like someone said or did something they never actually said or did. This technology has become widely accessible, and the number of deepfake videos has grown rapidly in recent years. For example, estimates suggest the number of deepfake videos increased by about 550% between 2019 and 2024, with tens of thousands in circulation online.
Because of this explosive growth and the realistic results AI tools now produce, innocent people can find their likeness used in videos without consent, sometimes in humiliating, defamatory, or sexually explicit ways.
Can You Sue Someone for Making an AI Video of You?
Yes. Depending on the circumstances, you may have one or more legal claims.
AI video creation is not inherently unlawful, but when an AI video uses your likeness without your consent in a way that harms you, several legal theories may apply:
Privacy and Publicity Rights
Most states recognize a right of privacy and a right of publicity. You can sue if someone uses your image, voice, or likeness for commercial gain or public distribution without permission. Unauthorized AI videos can violate these rights, much as using a person’s photo in an advertisement without consent would.
Defamation
If an AI-generated video falsely depicts you making statements you never made or engaging in conduct that harms your reputation, you may have a defamation claim. AI videos that spread false statements can be treated like published falsehoods.
Intentional Infliction of Emotional Distress
In some cases, especially when the content is sexual, violent, or deeply humiliating, you may have a claim for intentional infliction of emotional distress if the conduct was extreme and caused severe emotional harm.
State-Specific AI Laws
New laws are emerging. For example, the federal NO FAKES Act, which has been introduced in Congress, would create liability for unauthorized digital replicas and allow statutory damages against anyone who creates or shares them without consent.
Similarly, New Jersey law now makes creating and sharing deceptive AI media a crime and provides a basis for civil lawsuits.
These evolving laws reflect growing recognition that existing legal tools may not fully address the harms caused by AI-generated media.

For a free legal consultation, call (877) 735-7035
What Makes Digital Damages Different from Traditional Personal Injury Cases?
Cases involving AI and digital damages can be complex because:
- The video itself may be deeply manipulated to appear authentic.
- Determining who created or distributed the video may require digital forensics.
- Platforms that host the content may be separate from the creators.
- Multiple legal theories may overlap (privacy, defamation, publicity rights, and statute-based claims).
You should not have to bear the harm someone else inflicts by misusing your likeness, and the law provides paths to protect your rights and seek accountability.
Evidence You Should Preserve
If you suspect an AI video of you was created or shared without your consent:
- Save the video and any versions being circulated.
- Note where and when you saw it first.
- Screenshot URLs, social profiles, and metadata if possible.
- Record any messages, comments, or platforms where it appears.
- Do not delete original evidence even if you find it distressing.
AI-generated content can be copied and altered quickly, so early preservation is critical to any legal claim.
Click to contact our personal injury lawyers today
Talk to Our Digital Damages Team About Your Options
AI videos can cause real harm to real people. They can damage reputations, disrupt careers, invade privacy, and cause emotional trauma. While some content feels like a prank, the law treats unauthorized use of someone’s likeness seriously, especially when it crosses the line into harm or exploitation.
At J&Y Law, we handle serious injury and privacy cases involving unauthorized AI content and deepfakes. We understand how these cases unfold, how to gather evidence, and how to build claims that help you hold the responsible parties accountable.
Contact our team today to discuss what happened with a Los Angeles personal injury attorney and to learn what legal options may be available to you. You deserve to know your rights and to have them defended.
Call or text (877) 735-7035 or complete a Free Case Evaluation form