Megan Thee Stallion calls out sickening deepfake video

This article refers to assault, image-based abuse and suicide.

Rapper Megan Thee Stallion has spoken out after a sexually explicit, AI-generated video using her likeness was shared on social media over the weekend.

“It’s really sick how y’all go out of your way to hurt me when you see me win,” Megan posted on X, referencing the video. “You’re all going too far, Fake ass ***. Just know that today was your last day playing with me and I mean that.”

Per NBC News, there were at least 15 posts sharing the video on X.

Megan fought back tears on stage as she performed her song “Cobra” at the Tampa date of her “Hot Girl Summer Tour” later that same day. The emotional song details her mental health struggles and thoughts of suicide following the loss of her parents and grandmother and amid the Tory Lanez trial.

Megan has been continuously harassed online since 2020, when she first accused rapper Tory Lanez of shooting her in the foot. The incident sparked fierce debate online and Megan was subjected to widespread misogynistic hate and death threats.

In a statement to the court during the subsequent trial, Megan stated that she had not experienced “a single day of peace” since she was “mercilessly shot.”

Lanez has since been convicted of three felonies in connection with the shooting: assault with a semiautomatic firearm, carrying a loaded, unregistered firearm in a vehicle and discharging a firearm with gross negligence. He was sentenced to ten years in prison.

But that hasn’t stopped Megan from releasing music and continuing her activism. In a message to her fans ahead of the release of “Cobra” in November, she said: “Cobras are an example of courage and self-reliance. They stand strong and fierce in the face of challenges and teach one to tap into their inner strength and rely on themselves to overcome their threats.”

Megan isn’t the first famous woman to fall victim to this sickening content.

In January, Taylor Swift was also targeted, with her face artificially mapped onto sexually explicit images depicting her being assaulted. One image of Swift was viewed 47 million times before it was removed.

The capabilities of AI technology are becoming a major concern for women around the world. GLAMOUR’s 2023 Consent Survey shows that 91% of our readers think deepfake technology poses a threat to women’s safety.

GLAMOUR has previously campaigned for better legislation around deepfake technology, with the Ministry of Justice promising to criminalize the creation and distribution of sexually explicit, AI-generated deepfake videos. However, the timing of the general election has created uncertainty as to whether this legislation will be carried forward by the next government.

There are also limited laws about image-based abuse in the US. For example, USA Today found that only ten states are known to have laws regarding deepfake videos and images. Because US privacy laws vary by state, there are significant gaps in the legal system when it comes to prosecuting those who create and distribute this content.

Social media companies must urgently tackle the creation and sharing of such harmful content and work with governments around the world to stop the spread of this brutal misogyny. If it can happen to people like Megan Thee Stallion and Taylor Swift, it can happen to anyone.

The Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 living in the UK. You can call them on 0345 6000 459.