Ending digital violence: reclaiming online spaces for women and girls
Violence against women and girls has moved into our devices. From image-based abuse to coordinated harassment and non-consensual synthetic media, these attacks silence, intimidate and exclude women from public life. The UN’s 16 Days of Activism this year names the problem plainly: “UNiTE to End Digital Violence Against All Women and Girls.”
Across Europe, careful recent analysis shows this is not a marginal problem. The European Institute for Gender Equality’s (EIGE) report on cyberviolence documents a wide spectrum of harms and highlights how the digital dimension magnifies risks and complicates remedies. Online abuse is not a niche tech problem - it is a core gender-equality issue.
What is the problem and how does existing legislation address it?
What exactly is cyberviolence against women and girls (CVAWG)? According to the EIGE report, “CVAWG is an intersectional form of violence with different patterns and levels of vulnerability: (1) cyber stalking, (2) cyber harassment, (3) cyber bullying, (4) online gender-based hate speech, and (5) non-consensual intimate image abuse.” However, there is as yet no harmonised definition of, or legal response to, the phenomenon at EU level. This article focuses specifically on deepfake pornography.
Although the European Union has already taken important steps to tackle harmful content online, gaps persist. The Digital Services Act (DSA) obliges platforms to manage systemic risks and establish clearer notice-and-action systems for illegal content, including non-consensual pornographic content. However, this requirement addresses the issue only reactively, once the content exists, and does nothing to prevent the creation of such imagery.
Although the AI Act specifically references deepfakes, it addresses them purely through transparency obligations, merely requiring that they be labelled so that citizens can recognise their artificial nature.
However, the AI Act makes no mention of sexually explicit deepfakes depicting pornographic images of women and girls without their consent, and notably omits them from its list of prohibited AI systems and practices. This omission reflects a failure to recognise that CVAWG is not a private problem: it must be addressed on the political stage so that women can realise their full potential in both the public and private sphere.
The recently adopted Directive on combating violence against women addresses part of the problem. It criminalises the non-consensual production and dissemination of manipulated intimate material, including making a person appear to engage in sexual acts and sharing such content without their consent when it is likely to cause serious harm (Article 5). It also obliges Member States to provide victim support services, online reporting tools, removal orders and rapid evidence preservation. But as a Directive, it sets only minimum rules and leaves broad discretion in transposition and enforcement, risking fragmentation and uneven protection across borders.
In 2023, deepfake pornography made up 98% of all deepfake videos online, and 99% of the individuals targeted were women, according to a report by Security Hero. The same report highlights that deepfake pornography is extremely easy and cheap to create, and emphasises that “74% of deepfake pornography users don't feel guilty about it.” This clearly shows that CVAWG is anchored in the social inequality between men and women, and requires legal intervention.
Research by the European Sex Workers’ Rights Alliance (ESWA) reinforces this picture. Image-based sexual abuse, including manipulated intimate images, disproportionately affects marginalised groups. LGBTQIA+ people and sex workers face especially high rates, with sex workers also exposed to financial losses, outing, and retaliation. The report shows that victims often struggle to have abusive images removed because platforms respond slowly or are designed in ways that shift responsibility onto users. This form of victim-blaming forces individuals to take on heavy self-protection burdens instead of requiring platforms to prevent the abuse.
It remains crucial to distinguish between consensual adult sexual expression, including sex work, and non-consensual image-based abuse. Measures against deepfake pornography must be precise and grounded in consent and autonomy, not stigma, to avoid harming those whose livelihoods rely on digital spaces.
No more glitches: fixing Europe’s response
Volt demands the following elements to safeguard the rights of women and girls online, while recognising the rights and autonomy of sex workers:
The AI Act must include AI systems primarily used to create non-consensual deepfake pornography in its list of prohibitions. Sexually explicit synthetic content should only be lawful where the creator or distributor possesses verifiable, explicit consent from the person depicted. In all other cases, the content shall be presumed non-consensual and treated as image-based sexual abuse.
Member States should criminalise the creation and sharing of non-consensual deepfake pornography, following the example of the UK and South Korea, while recognising the rights of sex workers. Effective enforcement is entirely possible, as South Korea’s approach demonstrates. Dedicated training for law-enforcement officials enables them to recognise manipulated sexualised content and respond appropriately. Specialised police cyber units should actively monitor platforms where non-consensual deepfake pornography is frequently shared, and collaborate with hosting services to identify perpetrators.
Directive (EU) 2024/1385 on combating violence against women must be rapidly and ambitiously transposed into national legislation, and the DSA must be effectively enforced in a gender-sensitive manner.
Cross-border victim support is essential. Digital abuse does not stop at borders; we need EU-funded rapid-response units, legal assistance funds, and faster takedown and evidence-preservation mechanisms so victims can secure remedies and protection quickly.
Platforms must be required to implement tools, potentially AI-assisted, to help detect non-consensual or manipulated intimate imagery, alongside accessible reporting mechanisms that empower citizens to flag suspected non-consensual deepfake pornography.
Prevention and resilience must be funded. Education campaigns, digital literacy for girls and boys, and trauma-informed support services must be core lines in the next Multiannual Financial Framework and in national budgets.
Building the digital future we want
If we fail to act decisively now, digital violence will continue to entrench gender inequality and silence the voices of women and girls both online and offline. Entire generations risk growing up in spaces where fear, humiliation and abuse are routine, undermining their ability to participate fully in public, professional and creative life.
By treating cyberviolence as a core gender-equality issue, Europe can reclaim digital spaces as safe, empowering and inclusive. The choices we make today will determine whether the internet becomes a tool for liberation or a platform for perpetuating harm - Volt must champion the former.
Opinion article by Nina Hamann, Policy Shaper for Women’s Rights & Gender Inclusivity
WRomCom (Women’s Rights & Gender Inclusivity Community)