Technology-facilitated gender-based violence: the victims behind the screen 

Áine Hanrahan
Tue 20 Jan 2026 11:37
A growing and urgent issue has been emerging in the digital sphere for a number of years: women, children and vulnerable groups are increasingly becoming victims of image-based sexual abuse (also known as revenge porn). In recent weeks the platform X (formerly Twitter) has been under the spotlight because its AI system, Grok, makes nudified and sexualised images, in particular of women and children, extremely easy to create, access and share. These images often depict the victim undressed or in compromising positions and are a complete violation of their bodily autonomy.

Victims of this AI system are widespread and growing in number. The perpetrators of this violence appear shielded from the law by the cross-border nature of the abuse, the lack of resources to investigate it, and the anonymity the online sphere provides. The risks for those with intersectional identities are even greater, with women in public life, journalists, human rights defenders, LGBTIQ+ persons, migrants, women with disabilities and others being specific targets of this crime.

How it works: 

A person uploads an image of themselves or of another person, and a third party asks Grok, the AI software on X, something along the lines of "@grok take off their clothes and have tears in their eyes"*. The AI system will then automatically generate this image without consent from the original poster or the person depicted.

Where is this stemming from? 

Technology-facilitated gender-based violence (TFGBV) has grown in recent years alongside the "manosphere", where misogyny, toxic masculinities and incitement to violence are rife. Streaming platforms have been promoting content targeting young men and boys that belittles women and stokes anger and hatred towards them.

The incel (involuntary celibate) movement promotes the idea that women are lesser than men and encourages the sexual objectification of women's bodies, forgetting that there is a human being on the other side of the screen.

Large social media platforms have been criticised for not implementing safety-by-design, and their removal of harmful or illegal content related to TFGBV has so far been insufficient. X has announced that Grok will no longer be able to edit photos to sexualise or nudify people in jurisdictions where this is illegal, but this does not go far enough. Many countries have not implemented laws that keep pace with the technology and the TFGBV it enables, meaning many people remain at risk of falling victim to perpetrators and their violent use of AI.

What is the impact? 

Victims of TFGBV, particularly those who have experienced deepfakes and the non-consensual creation or sharing of sexualised or nudified images, are forced to live with the consequences of others' violent actions. These violations of bodily autonomy and personal identity can result in severe psychological harm, including depression and anxiety, and may lead victims to become isolated, fearful and socially withdrawn, and in some cases to self-harm or suicidal tendencies.

Further, the long-term impacts of these cases are still emerging. A victim's sense of safety, well-being and trust can be severely affected. The images can be extremely difficult to remove and often cannot be fully taken down because of the number of reshares. This causes a continuation of harm, with lasting impacts on victims' lives – personally, professionally and beyond.

For victims, there are many barriers to reporting this crime: the relevant laws are new and still being implemented by EU Member States, perpetrators remain anonymous, and the images are reshared at speed. Accessing justice is a further challenge due to the cross-border nature of the crimes and the ability of perpetrators to hide behind the screen. They can create multiple accounts, use VPNs to conceal their location, and take other steps to avoid discovery.

Secondary and repeat victimisation are also major risks for victims. A sexualised or nudified image can be modified, reshared and remain online for years to come, meaning the harm is not a single incident but a repeated experience. Victims who seek support or wish to report the incident to support services or authorities may face secondary victimisation, including disbelief, minimisation of harm, or a lack of appropriate responses, which can further undermine their trust in institutions and deter future help-seeking.

TFGBV is not a victimless crime; it is a violent attack on a person's autonomy, safety and well-being.

What to do? 

Stronger enforcement of EU law is needed to prevent repeat victimisation and ensure platforms uphold users’ rights.

Iqra Khan, Member of VSE's Youth Ambassadors Platform

If you or someone you know falls victim to the nudification or sexualisation of their images, please contact your nearest support service for guidance. In many EU countries these activities are already illegal and can be reported to the police for investigation.

Report any such image, and any account that requests or shares these images, regardless of jurisdiction.

Reporting or seeking support following non-consensual image creation or sharing helps inform Member States of just how widespread this issue is. This can lead to better prevention strategies as well as the development of indicators to assess victim protection and service responses.

Further, these figures can highlight the need for platform monitoring tools that track content removal, re-sharing, and the effectiveness of victim reporting mechanisms. 

Please contact your local, national and EU representatives to demand higher standards in relation to TFGBV. This topic has been a growing concern at every level of governance, and today the European Parliament is expected to call for stronger and faster enforcement of EU laws to prevent the use of AI tools to create illegal sexual content. It is important that they hear our concerns; the number of victims is growing rapidly.

*This example is composed of actual requests made on the platform.