Megan Thee Stallion Files Lawsuit Over AI Deepfake Distribution
by Oliver Dale · Blockonomi

TLDR
- Megan Thee Stallion filed a lawsuit against influencer Milagro Gramz over defamation and AI deepfake porn
- Lawsuit alleges Gramz shared and promoted AI-generated explicit content without consent
- Case connected to ongoing feud stemming from 2020 Tory Lanez shooting incident
- Robert Downey Jr. threatens legal action against AI digital replicas of his likeness
- Similar cases include Scarlett Johansson’s lawsuits against Lisa AI and OpenAI
The entertainment industry is seeing a wave of legal actions against unauthorized AI-generated content, with rap artist Megan Thee Stallion leading the latest charge in a lawsuit filed Tuesday in Florida’s Southern District Court.
Megan Pete, known professionally as Megan Thee Stallion, has taken legal action against Texas-based internet personality Milagro Cooper, who goes by Milagro Gramz online.
The lawsuit centers on allegations of harassment, defamation, and the spread of AI-generated deepfake pornography featuring Pete’s likeness.
Court documents reveal that Cooper, who has 27,000 followers on X (formerly Twitter), allegedly promoted explicit AI-generated content featuring Pete.
While the creator of the deepfake video remains unknown, Pete’s legal team argues that Cooper actively encouraged her followers to view the unauthorized content.
The lawsuit states that Cooper operated what Pete’s attorneys describe as an “online rumor mill.” Beyond sharing the AI-generated content, Cooper allegedly spread false claims about Pete’s mental health and claimed the rapper had a “severe drinking problem.”
This legal battle has roots in a broader conflict involving Canadian rapper Tory Lanez. The situation began in 2020 when Lanez was charged with shooting Pete in the feet during an altercation in Los Angeles.
Lanez was later convicted on three felony counts, including assault with a semiautomatic handgun.
Pete’s legal team is pursuing both compensatory and punitive damages. They’re also seeking legal fees and a court order to prevent future harassment by Cooper.
When informed of the lawsuit by Pete’s attorney Alex Spiro, Cooper acknowledged the situation on X, stating, “Of course, we’ll chat about it. They threw in the tape, too.”
The lawsuit reflects growing concerns about AI technology’s potential for misuse in creating unauthorized content. Similar cases have emerged across the entertainment industry, with other high-profile figures taking a stand against AI replications of their likenesses.
Actor Robert Downey Jr. recently addressed the issue of AI-generated content, stating he would “sue all future executives” who attempt to create digital replicas of his likeness without permission. This statement came during a discussion about the potential use of AI to resurrect his portrayal of Tony Stark in Marvel films.
Actress Scarlett Johansson has already taken legal action in two separate cases involving AI technology. She filed a lawsuit against image generator Lisa AI for creating and using a deepfake of her in promotional content.
Additionally, in May, Johansson pursued legal action against OpenAI over a voice-enabled chatbot that allegedly mimicked her voice without authorization.
The technology industry’s rapid advancement in AI capabilities has created new challenges for celebrities and public figures trying to protect their image and likeness rights.
These legal actions represent attempts to establish boundaries around the use of AI-generated content.
Representatives for both Cooper and Pete have not responded to requests for comment about the ongoing legal proceedings. The case is currently pending in the Southern District Court of Florida.
The rising number of lawsuits related to AI-generated content suggests this is becoming a pressing issue in the entertainment industry. As AI technology continues to evolve, legal frameworks are being tested and established through these cases.
Pete’s attorneys emphasize in the court documents that Cooper’s alleged harassment “knows no bounds,” highlighting the personal impact of unauthorized AI-generated content on public figures.
The case also brings attention to the role of social media platforms in the spread of AI-generated content. Cooper’s alleged use of YouTube and X to distribute and promote the unauthorized content demonstrates the multiple channels through which such material can be shared.
This lawsuit adds to the growing body of legal challenges surrounding AI-generated content, particularly regarding consent and image rights in the digital age.