Megan Thee Stallion was the target of a sexually explicit deepfake. It’s a huge problem.

Megan Thee Stallion is the latest celebrity to be targeted by a sexually explicit deepfake created without her consent – highlighting just how widespread this form of abuse is becoming.

Deepfakes are videos that often use AI to superimpose a person’s face onto another body (real or fictional), making it appear as if that person is doing or saying things they never did. They can include anything from clips that give the impression of a politician giving an interview they never gave, to sexually explicit, non-consensual videos that swap in people’s faces.

The latter has become an increasingly common form of sexual abuse, with new apps emerging that allow people to make clips of others they know. As Cleo Abram noted in a Vox video in 2020, “The most pressing threat from deepfakes is not politics. It’s porn.”

Last week, a sexually explicit deepfake of Megan Thee Stallion circulated on X. According to NBC News, the video has been viewed tens of thousands of times and posted by multiple accounts.

“It’s really sick how y’all go out of your way to hurt me when you see me win,” Megan Thee Stallion wrote in a statement on X on Saturday. “Y’all are going too far, Fake ass shit.”

Deepfakes are incredibly harmful to those targeted and are difficult to remedy because even when they are removed, the damage is already done. Following her post on X on Saturday, Megan Thee Stallion got emotional at a concert and cried while singing “Cobra,” a song that touches on issues related to mental health. (She didn’t acknowledge the deepfake during the show.)

Megan Thee Stallion is among a growing list of prominent women who have been victims of this kind of violation – and who are speaking out against it. Her experience highlights the scale of the problem, and its potential to harm even more people as the tools that make it possible become more widely available.

Deepfakes are a growing form of abuse

Megan Thee Stallion’s experience illustrates how deepfakes have been used as a weapon in recent years, including against other celebrities such as Taylor Swift, but also against private individuals. As cybersecurity firm DeepTrace found in 2019, 96 percent of deepfake videos on the internet were pornographic, and nearly 100 percent of those were of women.

“Deepfake sex videos tell individuals that their bodies are not theirs and can make it difficult to stay online, get or keep a job, and feel safe,” Danielle Citron, a law professor at Boston University, said in the DeepTrace report.

Not only are these videos traumatic when they appear, but the consequences can haunt women for years, damaging their reputations and mental health. Such abuse, like so-called revenge porn – a form of abuse in which nude photos of women are posted without their consent – is demeaning and aimed at stripping women of their power. “Deepfake sexual abuse is mostly about trying to silence women who speak out,” Clare McGlynn, a law professor at Britain’s Durham University, told Glamour.

Megan Thee Stallion has previously spoken out about the actions of rapper Tory Lanez, who was convicted of shooting her in the foot. In the years since, she has also borne the brunt of numerous attacks from men who questioned her story and dismissed her experiences.

As Vox’s Anna North has reported, the prevalence of such deepfakes is only expected to increase as AI technology becomes more common and easy to use. In some cases, mobile apps have even allowed high school students to create non-consensual sexually explicit deepfake images of their classmates. Online creators are now also offering custom deepfakes for people who want to create these videos and images of famous stars or individuals they know.

A worrying aspect of this abuse is how few options people have to combat it. In Megan Thee Stallion’s case, X has been active in removing the videos, though it has not always responded this way, even for other prominent figures, NBC News reports. Furthermore, North writes, federal efforts to pass a law banning such deepfakes are still ongoing, and more accountability from tech companies is desperately needed to truly combat this problem.