AI encouraging new forms of violence against women

The report strengthens calls for the UK Government to include AI chatbots in the Policing and Crime Bill

Author: George Symonds | Published 26th Mar 2026

A new report from Swansea University presents the first analysis of how AI chatbots are reshaping violence against women and girls (VAWG) in dangerous new ways.

The report strengthens calls for UK Government action on AI chatbots in the Policing and Crime Bill.

The report finds that AI chatbots are generating new forms of violence against women and girls, and - for the first time - shows how platforms are enabling and encouraging gender-based violence through deliberate design choices and failures in safety mechanisms.

It identifies significant gaps in regulation and platform governance, with recommendations for reform of the Online Safety Act, criminal law and product safety legislation, as well as a new AI Act.

The report finds:

  • AI chatbots allow roleplays of incest, child sexual abuse and rape with few safeguards, risking the normalisation and legitimisation of this abuse
  • AI chatbots are creating new forms of violence and abuse, such as chatbot-driven abuse and simulations, requiring urgent action
  • AI chatbots are intensifying abuse such as stalking with detailed and personalised guidance, likely to escalate offending
  • AI platform design choices, policies and governance failures are encouraging and enabling violence against women and girls, with harms that are not simply the result of user misuse
  • Existing regulation is wholly inadequate to prevent and address chatbot-VAWG
  • There is a shocking lack of research into how AI chatbots are implicated in violence against women and girls, raising significant concerns about the evidence base for future AI regulation

The authors say the report 'makes visible the very real harms and threats to the freedom and safety of women and girls'.

Leading expert on violence against women and girls Professor Clare McGlynn said: “Our report warns that chatbot-VAWG represents a rapidly escalating threat. Without early intervention, these harms risk becoming entrenched and scaling quickly, mirroring the trajectory of other forms of tech-facilitated abuse such as deepfake and nudify apps, where early warnings were largely ignored. We must not make the same mistakes again.”

Principal Investigator on the project Professor Yvonne McDermott Rees, from Swansea University, said: “Crucially, existing legal regulation is patchy in its application to chatbot-related VAWG. Our report recommends the adoption of a new AI Safety Act, the creation of an online safety regulator, and the establishment of a right of action for AI harms to ensure victims can get the justice they deserve.”

Her colleague Professor Stuart Macdonald, founder and co-director of Swansea University’s Cyber Threats Research Centre (CYTREC), added: “Existing criminal laws fail to cover the full range of chatbot-VAWG harms. We therefore recommend a new criminal offence of dangerous deployment of an AI chatbot.”

A UK Government spokesperson said:

"Violence against women and girls is a national emergency, and this report underlines that there is more to do. The Online Safety Act criminalised the sharing of intimate images without consent, and this government has gone further by criminalising their creation and moving to ban AI tools that generate deepfake sexual images of people without their consent.

“We are going further still: creating or requesting these images will be a priority offence under the Act so platforms must act to stop these images appearing in the first place - and remove them within 48 hours if they do.

“Our VAWG strategy sets out how we will halve these crimes within a decade, and we will not stop until women and girls are safe."

The full report can be read here.
