Paedophiles using AI to create realistic images of child exploitation.

AI has sparked a race to access child exploitation material on the dark web.

June 28th 2023.

The rise of Artificial Intelligence (AI) technology has sparked a ‘predatory arms race’ among paedophiles. These criminals are using AI programs to generate lifelike images of child sexual abuse, raising concerns that the material will undermine efforts to find victims and combat real-world abuse.

Rebecca Portnoff, the director of Data Science at the nonprofit child-safety group Thorn, told The Washington Post that children’s images, including those of known victims, are being ‘repurposed for this really evil output’. She added that the ease of using these tools and the realism of the images creates an even greater challenge for law enforcement’s efforts to find and protect children in harm’s way.

In the UK, a ‘pseudo image’ generated by a computer which depicts child sexual abuse is treated the same as a real image and is illegal to possess, publish or transfer. Ian Critchley, the National Police Chiefs’ Council lead on child safeguarding, said that these programs could allow paedophiles to move along a ‘scale of offending’, from thoughts, to synthetic material, to the abuse of a live child.

The emergence of such images also has the potential to undermine efforts to find victims and combat real abuse, forcing law enforcement to go to extra lengths to investigate whether a photograph is real or fake. The Washington Post also reported that AI-generated child sex images could ‘confound’ the central tracking system built to block such material from the web, because it is designed only to catch known images of abuse rather than to detect newly generated ones.

Furthermore, AI tools can re-victimise individuals whose photographs from past child sexual abuse are used to train models to generate fake images. Some of the image creators are posting on Pixiv, a popular Japanese social media platform mainly used by artists sharing manga and anime. Because the site is hosted in Japan, where sharing sexualised cartoons and drawings of children is not illegal, these creators can share their work via groups and hashtags.

The subscription-based platform Patreon is also used to host the obscene images, with accounts selling AI-generated, photo-realistic images of children behind a paywall with different levels of pricing depending on the type of material requested. Journalist Octavia Sheepshanks told the BBC that the volume of these images is huge and that many people in these groups are sharing links to real child sex abuse material.

Patreon has a ‘zero-tolerance’ policy towards hosting child abuse images, real or otherwise, and says it has dedicated teams, technology and partnerships to ‘keep teens safe’. Similarly, Pixiv banned all photo-realistic depictions of sexual content involving minors on May 31.

The rise of AI technology has created a worrying trend among paedophiles, as AI programs that generate lifelike images of child sexual abuse threaten to undermine efforts to find victims and combat real-world abuse. Law enforcement officials are now being forced to spend time determining whether images are real or AI-generated. It is crucial that platforms such as Patreon and Pixiv take a strong stance against these images and continue to work to protect children from online predators.
