When AI mirrors misogyny and the internet stops loving you back

Smrithi Mohan
Content creators Amala Parammal, Prity Darjee, Sharanya Nambiar and Divya Gupta talk about being female creators online and dealing with the dehumanisation of women through AI sexualisation and deepfake culture.

We are a society where AI is either fully embraced or hated enough that we wish it never existed. And then there are people like me, caught somewhere in between. Ethically conflicted, yet complicit because of how difficult it is to ignore an AI bot. Most of this ethical dilemma comes from knowing how humans have managed to put this technology to its worst use. Case in point: the recent uproar over people using AI to sexualise women with a single prompt, or even taking an artist’s cover of ‘Just a Boy’ and recreating it with hyper-realistic AI-generated women.

Out of curiosity, and as someone who often finds herself asking it for assistance, I asked ChatGPT what it thinks of people using AI to sexualise women online. The result? A surprisingly human response from a platform made of binary code. For something that lacks emotions and empathy, it said, “Short answer? It’s deeply unsettling — and also very revealing. What bothers me most is how dehumanising it is. These women — real or imagined — stop being people and become prompts.”

For a society that considers itself progressive and one that ‘knows better’, our idea of what women are supposed to be remains limited. On one hand, we have women who find it liberating to express themselves online; on the other, we come across people who dismissively joke about that very presence. Whether it’s boomerang stories of brunch outings or GRWM videos, the mockery feels like nothing short of misogyny with a mic, one that couldn’t care less about a woman’s love for what she creates. As an Indian trans woman, Amala Ajithkumar Parammal has never seen content creation as a casual act, but as survival. “Content has had a large impact on my personal journey of understanding and making peace with my trans feminine identity.” What once felt like an isolated struggle became a collective one through creators who came before her and made a safe space. “For me, it was my time to do the same.” For Prity Darjee, who went from secretly filming dance videos on her father’s phone to mimicking Bollywood characters, creation was never calculated. “This has always been my purest, most effortless form of love.”

Content creation itself is a love story of expression, visibility, and connection. However, being in the public eye now comes with new threats. What once revolved around negative comments has shifted to a concern about AI, a technology intended to assist humans. There has been a rise in incidents of AI morphing, where generative tools like Grok are used to create non-consensual sexually explicit content. This has instilled fear among creators who simply want to share their authentic selves online. Simply existing as a woman online is already a struggle for these creators, and that struggle is only being amplified.

It’s a shift Divya Gupta felt firsthand. What started as casual reels for Divya soon became a balancing act between passion and responsibility. The rise of AI has made her more cautious. “Your digital identity isn’t fully under your control anymore, and that’s scary.” Sharanya, on the other hand, finds herself with more questions about how to stay safe. “How do you even save yourself from that? You never know which video of yours will end up being used, and how. So obviously there’s this fear in your mind as a creator when you’re putting out a video that you don’t want it to be, you know, used against you or even in a way that it was never intended to be.” She thinks this fear is a valid reaction that people are bound to feel.

Amala spoke from her experience of being a trans woman in India. As part of a community that has long been an easy target for institutions and individuals who further their cowardly agendas by inciting fear, sensationalising, and pathologising their lived experiences, she shares how living in fear of “erasure” has become muscle memory. “Now watching the larger majority fearing a similar 'erasure' is in fact an interesting visual as a trans woman and a content creator.”

AI was created to gather information and draw inspiration from real individuals and their thoughts, which makes the type of content now being produced all the more concerning. What could have been used for creative exploration is too often exploited to defame individuals for amusement. Amala, who strives to remain optimistic about AI’s potential, firmly believes in looking at technology and art differently. “Art is where I draw the line,” she says, highlighting not only the psychological but also the environmental costs involved. While she is aware that AI models can feed on her work, she finds security in her creative process; her work is shaped by lived experience. Meanwhile, Sharanya expresses how witnessing peers have their work stolen or misused has been sobering. “You never know which of your videos might be used inappropriately.” The misuse of AI, particularly through deepfakes, feels especially alarming, not only as a creator but also as a woman. “You feel utterly helpless when situations like this arise.”

And perhaps that is the most unsettling part of all. You realise that AI did not invent misogyny; it simply mirrors it. The problem is not with the technology itself but with our painfully human ethics. Across these voices runs a shared truth: what AI cannot erase is the reason women create. Nor can it replicate our joy, vulnerability, lived experience, and connection. AI may master imitation, but the source will always be human.

What are your thoughts on AI being misused and its impact on women's safety online? Let us know in the comments below.

This article was first published in the Social Ketchup Magazine's February 2026 edition.

For more such content, follow us @socialketchup.
