When Olivia was rescued by police at the age of eight, she thought her five-year rape ordeal was over.

But thanks to advances in artificial intelligence, her torment may never end.

Paedophiles are using AI software to generate new content and ‘bespoke scenarios’ using real-life images of her abuse, which were circulated online.

Olivia was rescued by police in 2013 after being repeatedly raped and sexually tortured from the age of three. 

Her abuser posted so many images that analysts from the Internet Watch Foundation (IWF) were seeing them every day through their work.

One victim of child sex abuse, Olivia, is now being tormented by her images being used to create new abuse material via AI tools


Now 20, Olivia is victimised every time images of her abuse continue to be shared and sold online.

Offenders have created AI models for generating novel images of her which are available to download for free, allowing predators to generate pictures of her in any setting or sexual activity they can imagine.

In a new report released today, the IWF has also found models for generating AI material of celebrity children, warning there is a ‘new cottage industry of online criminals who are creating life-like child sexual abuse to order’.

A dark web forum user reportedly shared an anonymous webpage containing links to AI models for 128 different named victims of child sexual abuse.

Other fine-tuned models can generate AI child sexual material of celebrity children.

IWF chief executive Susie Hargreaves said: ‘Survivors of some of the worst kinds of trauma now have no respite, knowing that offenders can use images of their suffering to create any abuse scenario they want. 


Paedophiles are using AI software to generate new content and ‘bespoke scenarios’ using real-life images of abuse (stock photo)

‘Without proper controls, generative AI tools provide a playground for online predators.’

Marie Collins Foundation chief executive Victoria Green said: ‘To know that offenders can now use easily available AI technology to create and distribute further content of their abuse is not only sickening for victims and survivors, it causes immense anxiety. 

‘Victims and survivors have a right not to live in fear of revictimisation.’

The IWF added that the AI tools used to create the images remain legal in the UK, even though AI child sexual abuse images are illegal.

A spokesperson for the group said: ‘Although now free of her abuser, Olivia, like many other survivors, is repeatedly victimised every time imagery of her abuse continues to be shared, sold and viewed online.

‘This torment has now reached a new level because of the advent of generative text-to-image AI, which is being exploited by offenders.

‘Fine-tuned models like Olivia’s have been trained on the imagery that IWF analysts were seeing daily but despite best efforts were unable to eradicate.

‘This means that the suffering of survivors is potentially without end, since perpetrators can generate as many images of the children as they want.

‘The IWF knows, from talking to adults who have suffered repeated victimisation, that it’s a mental torture to know that their imagery continues to be circulated online.

‘For many survivors, the knowledge that they could be identified, or even recognised from images of their abuse is terrifying.’

IWF analysts found 90 per cent of AI images were realistic enough to be assessed under the same law as real child sexual abuse material (CSAM), and that they are becoming increasingly extreme.

It warned ‘hundreds of images can be spewed out at the click of a button’ and some have a ‘near flawless, photo-realistic quality’.

Ms Hargreaves added: ‘We will be watching closely to see how industry, regulators and Government respond to the threat, to ensure that the suffering of Olivia, and children like her, is not exacerbated, reimagined and recreated using AI tools.’

Richard Collard of the NSPCC said: ‘The speed with which AI generated child abuse is developing is incredibly concerning but is also preventable. 

IWF analysts found 90% of AI images were realistic enough to be assessed under the same law as real child sexual abuse material (CSAM), and that they are becoming increasingly extreme (stock photo)


‘Too many AI products are being developed and rolled out without even the most basic considerations for child safety, retraumatising child victims of abuse.

‘It is crucial that child protection is a key pillar of any Government legislation around AI safety. 

‘We must also demand tough action from tech companies now to stop AI abuse snowballing and ensure that children whose likenesses are being used are identified and supported.’

A Government spokesperson said: ‘We welcome the Internet Watch Foundation report and will carefully consider their recommendations.

‘We are committed to further measures to keep children safe online and go after those who would cause harm, including where AI is used to do so.’
