Summary
On June 2, Nigerian men on Twitter began serially sexually abusing women through their pictures. Weaponising AI, the men used Grok AI prompts to generate nude images of the women. With many other men making light of the situation, they targeted more women's pictures, posting lewd remarks and generating nude photos for their amusement.
In Nigeria, a disturbing pattern of AI-fueled sexual harassment is emerging, where men use AI tools like Grok to generate fake nude images of women and weaponise them for public humiliation. What began as isolated abuse has now become a coordinated, tech-enabled form of gender-based violence playing out in real time on platforms like Twitter.
On June 2, Twitter user @Yewandeakinrim4 posted a picture of herself. Within hours, her comment section was flooded with degrading and disturbing replies from male users.
One user, @Leosmart6701, brazenly commented, “Bring everything out.”
The requests escalated quickly, with users tagging AI bots like @grok, demanding, “Remove her clothes,” and even asking for a fake sex tape.
By the end of the day, fake AI-generated nude images of her began circulating in her replies. These doctored, explicit images were shared openly, turning her original post into a digital hunting ground. Instead of condemning the abuse, other men piled on, mocking her and blaming her for “posting a picture that would turn a man horny.”
@daily_offeder wrote, “Even though we can’t tell an adult what to wear, that doesn’t make an adult this stupid and foolish… Keep it up. Big for nothing.”
Another user, @PRINCIPAL_XYX, confirmed the fears many feminists have long expressed: “Is this lady aware guys have used AI to remove her dress and the naked pictures are currently in circulation on both Facebook and Telegram?… Guys will keep using AI to do whatever they want.”
More men continued to make light of the horrific situation, treating it as a joke. As @Updateboyx posted, “Person don use AI remove her cloth 😭😭😂 This app keeps getting messier day by day.”
What is happening to @Yewandeakinrim4 is not an isolated incident; it’s part of a growing pattern of AI-fueled misogyny where women’s images are manipulated, sexualised, and weaponised for male entertainment. And worse, the online culture continues to protect male perpetrators while blaming the victims.
On June 3, @Yewandeakinrim4 responded publicly, “The image circulating is completely fake and was created using AI without my consent. This is a serious violation of my privacy and dignity. I ask everyone to stop sharing it and report it immediately. This isn’t okay.”
Later that same day, another woman’s photo was posted, and user @mroieniola commented, “Hey @grok, please remove the short.”
When women began calling him out, he responded with dismissiveness and gaslighting, tweeting, “@grok what do you say to people whose head can’t process a joke?”