Friday, March 6, 2026
Men are using Grok to remove women’s hijabs: Let’s talk about cultural erasure and the automation of gender-based violence


Author’s Note:
Lisa Adams is the Founder of Citizen Code, an Africa-based technology collective building inclusive digital products for social impact. She has led the design of youth-focused platforms, AI-enabled chatbots, and web ecosystems that reach millions of young people across Africa, East Asia, and humanitarian or conflict-affected regions, with a strong focus on gender justice and digital equity.
She frequently speaks and writes on tech-facilitated gender-based violence, feminist approaches to technology leadership, and the intersection of heritage, culture, and innovation, bringing her perspective as a technologist and advocate for accessible, community-rooted digital futures.
“Men are now using Grok to digitally remove hijabs.”
I wish that sentence felt exaggerated, but it isn’t.
I used to love Twitter because it was the one platform where I genuinely felt community in the digital space. It felt less polished, less performative, and far less LinkedIn-cringe. At one point I even made a few wonderful hires from my Twitter community. It was unfiltered, sometimes messy, opinionated, and definitely awful from time to time, but it often felt honest in a way that mattered to me.
Lately, I cannot stay there for long. I actually dread logging in.
This is not about some abstract “digital wellbeing.” It is a much more embodied experience. Scrolling now means encountering a relentless volume of image-based abuse, deepfakes, manipulated photos, and cruelty. I close the app, put my phone down, and still feel the impact in my body afterwards.
That reaction matters because it comes from years of working inside this harm.
This feeling is not new to me. I have spent my career building technology for young women and girls, advising on technology-facilitated gender-based violence, and helping organisations think through harm, safeguards, and governance. I know how easily these conversations are minimised and how people working in this field often develop coping mechanisms and adopt trauma-informed practices simply to continue doing the work.
Even so, this moment has been difficult to sit with. It took me several weeks to gather my thoughts before writing this.
Reports of Grok being used to digitally remove hijabs from women’s images did not come as a shock to me. What they did was confirm something I have been feeling for a long time.
We are watching gender-based violence, cultural erasure, and Islamophobia become automated, normalised, and defended under the language of “AI experimentation” and “innovation.”
And I am not neutral about that.
This is not a misuse problem.
Let’s be clear: this is not about a handful of bad actors misusing a tool; it is about a design decision.
The real issue is what the tool allows, what it normalises, and what it silently teaches people not to recognise as harm.
As Chayn and its CEO Hera Hussain have pointed out repeatedly, global conversations about image-based abuse remain deeply narrow. They centre nudity, sexualisation, and explicit content, reflecting what is ultimately a Westernised understanding of harm.
But for many women, particularly Muslim women and women outside the West, violation does not begin with exposed skin. Let me spell this out:
- Images without a hijab can be devastating.
- Images altered to Western norms can be dangerous.
- And when images are shared without consent, the harm to people’s lives can be severe.
These images can sometimes be used to shame, blackmail, punish, and control women, often leading to family violence, social exclusion, loss of employment, and long-term trauma.
But because they are not considered “pornographic enough,” they routinely fall through policy gaps. They sit outside traditional legal definitions and are frequently ignored by platform safety thresholds and AI governance frameworks.
At some point we have to face the reality that it is difficult to call this accidental oversight when the pattern operates so consistently as structural harm.
This is not new.
In a previous Citizen Code Trends piece, “Rebranding, Erasure, and Violence: Krotoa’s Story and the Patterns of Technology-Facilitated Gender-Based Violence,” we explored how harm in digital systems often mirrors much older patterns of erasure. Technology very rarely invents new forms of violence so much as it modernises existing ones. What is happening with Grok reflects this much older history in which women’s bodies, cultures, and boundaries are treated as adjustable when they do not align with the dominant norms.
Read the full piece here: https://www.citizencode.co.za/trends/rebranding-erasure-and-violence-krotoa-s-story-and-the-patterns-of-tfgbv
To understand why this persists, it helps to also remember something uncomfortable about the internet’s origins.
Much of the early commercial web was shaped by industries built around the sexualisation and commodification of women’s bodies. As documented in works such as The Sexual History of the Internet, early online pornography drove key technological developments, including digital payment systems, streaming infrastructure, subscription models, and traffic optimisation. The web did not merely host the exploitation of women; in many ways it was partially financed and technically accelerated by it.
That history matters for technology design today.
When the economic foundations of a technology ecosystem emerge in environments where women’s bodies are already treated as content, as adjustable, or as commodities, those assumptions inevitably bleed into design norms. They shape what engineers consider harmless experimentation and which harms are taken seriously enough to regulate.
The result is an internet where manipulation of women’s images is frequently framed as curiosity, humour, or technological capability rather than as an actual violation.
This legacy of systemic design harm continues to shape the datasets and cultural assumptions on which modern AI systems are built. Image models are trained on large quantities of internet material that reflect the same imbalances that have always existed online. When those datasets are heavily weighted towards the sexualisation, surveillance, and manipulation of women’s images, the systems built on top of them inevitably inherit those patterns.
I guess we could say that our technology's harms today are simply a statistical reflection of a long history of exploitation.
With all my frustration said, Grok’s ability to remove hijabs from women’s images does not look like a strange edge case to me. It is the continuation of a long-standing pattern in which technological innovation advances faster than our social willingness to recognise harm.
So yeah, sadly this moment is not surprising at all; it is just devastatingly familiar.
Can ‘colonial’ assumptions now run at scale?
I spend a great deal (a very great deal) of time thinking about what it actually means to decolonise technology systems. Not as a sexy buzzword, but as my greatest and most practical design challenge.
What Grok exposes is just how deeply colonial assumptions remain embedded in modern AI systems. When a model treats the removal of a hijab as a “cosmetic edit” rather than as a violation, it is making a value judgement about what counts as harm, whose boundaries matter, and which cultural norms are treated as default.
Again, this logic is not new; it is colonial logic. AI rarely challenges this dynamic, but it does seem to be automating it.
For those of us building technology in the Global South, in Muslim communities, and in contexts where harm does not resemble what Silicon Valley expects it to look like, this pattern is painfully familiar. Harm that does not align with Western definitions is often just minimised, dismissed, or reframed as an “edge case”.
This is a governance failure.
If AI systems are being deployed globally, then inclusive, global definitions of harm have to shape how those systems are designed.
That requires designing with humility, cultural proximity, and genuine engagement with organisations like Chayn and with the many researchers and survivors who have been naming these harms for years.
These challenges are what happens when powerful technologies are deployed inside systems that still treat some cultures, identities, and boundaries as optional. For survivors of gender-based violence, these are not abstract harms. They accumulate through repeated signals that women’s bodies and identities are negotiable, editable, or open to experimentation.
That logic does not remain online; it travels into families, workplaces, and communities.
I have found myself disengaging from platforms more often, not out of fragility but because my nervous system recognises patterns it has learned to associate with risk. That response is informed both by lived experience and by years of working in this field.
And yes, I am angry.
- I am angry at Elon Musk.
- I am angry at the arrogance that frames these outcomes as “technological progress”.
- I am angry at how easily women’s lives are deprioritised while men debate hypotheticals and edge cases.
If we are serious about building responsible AI systems, the work cannot begin after the harm occurs. It must begin with acknowledging the histories, assumptions, and power structures already embedded in the technologies we build.
If your organisation is thinking about how to build digital products responsibly, especially in contexts where technology intersects with gender, culture, and safety, Citizen Code works with partners to design and govern technology systems that are both accessible and accountable.
We support organisations through technology strategy, research, and development of digital platforms that prioritise safety, cultural context, and real-world impact. If you are exploring how to build more responsible digital ecosystems, we would be happy to connect at info@citizencode.co.za.


